Course a.y. 2024-2025

20603 - OPTIMIZATION

Department of Decision Sciences

Course taught in English

DSBA (8 credits - II sem. - OBCUR  |  SECS-S/06)
Course Director:
ANTONIO DE ROSA

Classes: 31 (II sem.)
Instructors:
Class 31: ANTONIO DE ROSA


Mission & Content Summary

MISSION

This course provides an overview of modern optimization methods with a view to applications in machine learning, statistics and data science. In particular, it aims to develop the modeling and analytical skills that students will need later on, in both their academic and professional careers: how to formulate a decision-making problem as a mathematical optimization problem, and how to design an efficient algorithm to solve it. The course provides the skills to recognize and formulate a well-defined convex optimization problem, to develop efficient numerical methods for solving such problems, and to analyze and interpret the results.

CONTENT SUMMARY

Part 1: Theory

  • Mathematical background: multivariable calculus and linear algebra.

  • Convex sets: affine and convex sets, operations that preserve convexity, separating and supporting hyperplanes.

  • Convex functions: basic properties and examples, operations that preserve convexity, the conjugate function, quasiconvex functions, log-concave and log-convex functions.

  • Convex optimization problems: optimization problems, convex optimization, linear optimization problems, quadratic optimization problems.

  • Duality: the Lagrange dual function, the Lagrange dual problem, geometric interpretation, optimality conditions, perturbation and sensitivity analysis, examples, theorems of alternatives (see the sketch below).
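
For orientation, a minimal sketch of the objects the duality item refers to, in the notation of the Boyd and Vandenberghe textbook listed under Teaching materials. The generic problem in standard form is

\[
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & f_0(x) \\
\text{s.t.} \quad & f_i(x) \le 0, \quad i = 1, \dots, m, \\
& h_j(x) = 0, \quad j = 1, \dots, p,
\end{aligned}
\]

with Lagrangian and Lagrange dual function

\[
L(x, \lambda, \nu) = f_0(x) + \sum_{i=1}^{m} \lambda_i f_i(x) + \sum_{j=1}^{p} \nu_j h_j(x),
\qquad
g(\lambda, \nu) = \inf_{x} L(x, \lambda, \nu).
\]

The Lagrange dual problem is \(\max_{\lambda \ge 0,\, \nu} g(\lambda, \nu)\); weak duality \(g(\lambda, \nu) \le p^\star\) holds for every \(\lambda \ge 0\), and strong duality holds for convex problems under constraint qualifications such as Slater's condition.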
     

Part 2: Algorithms

  • Unconstrained minimization: unconstrained minimization problems, descent methods, gradient descent method, steepest descent method, Newton's method (a short numerical sketch of gradient descent appears after this list).

  • Equality constrained minimization: equality constrained minimization problems, Newton’s method with equality constraints.

  • Stochastic gradient methods: noisy gradients, incremental gradient method, classification and the perceptron, empirical risk minimization, the randomized Kaczmarz method, convergence analysis.
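
As a concrete illustration of the descent methods listed above, here is a minimal Python sketch of gradient descent with a fixed step size on a least-squares objective. The problem data, step-size rule and tolerances are assumptions chosen for this example and are not taken from the course materials.

import numpy as np

# Minimal sketch: gradient descent with a fixed step size on the
# least-squares objective f(x) = 0.5 * ||A x - b||^2, whose gradient
# is A^T (A x - b). A, b and the stopping tolerance are hypothetical
# choices made for this illustration.

def gradient_descent(A, b, step=None, tol=1e-8, max_iter=10000):
    x = np.zeros(A.shape[1])
    if step is None:
        # Step 1 / L, where L = ||A||_2^2 is the Lipschitz constant of
        # the gradient, guarantees convergence for this objective.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        if np.linalg.norm(grad) < tol:
            break
        x = x - step * grad
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
b = rng.standard_normal(50)
x_gd = gradient_descent(A, b)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_gd, x_ls, atol=1e-5))  # expected output: True

With an exact line search or a backtracking rule the same loop becomes the steepest descent method covered in the course; the fixed-step variant is kept here only for brevity.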


Intended Learning Outcomes (ILO)

KNOWLEDGE AND UNDERSTANDING

At the end of the course, students will be able to...
  • Carry out a formal mathematical proof.
  • Master the theory of convex sets and functions.

  • Formulate convex optimization problems.

  • Master duality theory.

  • Develop and analyze efficient algorithms for solving both unconstrained and constrained optimization problems.

APPLYING KNOWLEDGE AND UNDERSTANDING

At the end of the course, students will be able to...
  • Model challenging applications from data science and machine learning as mathematical optimization problems.

  • Interpret the results by performing sensitivity analysis in an applied context.

  • Identify the state of the art of optimization algorithms for different types of optimization problems.

  • Solve convex optimization problems.


Teaching methods

  • Face-to-face lectures
  • Exercises (exercises, database, software, etc.)

DETAILS

Some exercises will be solved during the lectures, with the aim of applying the main theoretical results to different problems.


Assessment methods

  Assessment type                                     Continuous assessment   Partial exams   General exam
  • Written individual exam (traditional/online)                                                    x

ATTENDING AND NOT ATTENDING STUDENTS

The written individual exam aims at evaluating:

  • the knowledge of the fundamental mathematical notions and the ability to apply these notions to the solution of simple problems and exercises;
  • the ability to articulate the knowledge of mathematical notions in a conceptually and formally correct way, adequately using definitions, theorems and proofs;
  • the ability to actively search for deductive arguments suited to proving possible links between the properties of mathematical objects;
  • the ability to apply mathematical notions to the solution of more complex problems and exercises.

Teaching materials


ATTENDING AND NOT ATTENDING STUDENTS

  • Stephen Boyd and Lieven Vandenberghe, Convex Optimization, Cambridge University Press, 2004. Available online at: http://www.stanford.edu/~boyd/cvxbook/
  • Benjamin Recht and Stephen J. Wright, Optimization for Modern Data Analysis, 2019. Preprint available online at: https://people.eecs.berkeley.edu/~brecht/opt4ml_book/
