
EPFL Course - Optimization for Machine Learning - CS-439

Official coursebook information

Lectures: Fri 13:15-15:00 in CO2

Exercises: Fri 15:15-17:00 in BC01

This course provides an overview of modern mathematical optimization methods for applications in machine learning and data science. In particular, the scalability of algorithms to large datasets is discussed both in theory and in implementation.
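To give a flavour of the "theory and implementation" aspect, below is a minimal sketch (not part of the course materials) contrasting full-batch gradient descent with stochastic gradient descent on a least-squares problem; the problem sizes, step sizes, and noise level are illustrative assumptions.

```python
# Minimal sketch (not course material): full-batch gradient descent vs.
# stochastic gradient descent on least squares, illustrating why per-step
# cost matters on large datasets. All constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, d = 10_000, 50                        # n samples, d features
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

def full_gradient(x):
    # Touches all n samples: O(n d) per step.
    return A.T @ (A @ x - b) / n

def stochastic_gradient(x):
    # Touches a single random sample: O(d) per step.
    i = rng.integers(n)
    return A[i] * (A[i] @ x - b[i])

x_gd, x_sgd = np.zeros(d), np.zeros(d)
for t in range(1, 1001):
    x_gd -= 0.1 * full_gradient(x_gd)              # constant step size
    x_sgd -= 1.0 / t * stochastic_gradient(x_sgd)  # decaying step size

print("GD  error:", np.linalg.norm(x_gd - x_true))
print("SGD error:", np.linalg.norm(x_sgd - x_true))
```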

Team

Contents:

Convexity, Gradient Methods, Proximal algorithms, Subgradient Methods, Stochastic and Online Variants of the mentioned methods, Coordinate Descent, Frank-Wolfe, Accelerated Methods, Primal-Dual context and certificates, Lagrange and Fenchel Duality, Second-Order Methods including Quasi-Newton Methods, Derivative-Free Optimization.
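As one example of the proximal algorithms listed above, here is a minimal sketch (not course material) of proximal gradient descent (ISTA) for the lasso problem; the data, regularization strength, and step size are illustrative assumptions.

```python
# Minimal sketch (not course material): proximal gradient descent (ISTA)
# for the lasso problem  min_x (1/2n)||Ax - b||^2 + lam * ||x||_1.
# Problem sizes, lam, and the step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n, d, lam = 200, 50, 0.1
A = rng.standard_normal((n, d))
x_sparse = np.zeros(d)
x_sparse[:5] = rng.standard_normal(5)        # ground truth with 5 nonzeros
b = A @ x_sparse + 0.01 * rng.standard_normal(n)

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

L = np.linalg.norm(A, 2) ** 2 / n            # Lipschitz constant of the smooth gradient
x = np.zeros(d)
for _ in range(500):
    grad = A.T @ (A @ x - b) / n             # gradient of the smooth part
    x = soft_threshold(x - grad / L, lam / L)  # gradient step, then prox

print("nonzeros recovered:", np.count_nonzero(np.abs(x) > 1e-3))
```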

Advanced Contents:

Parallel and Distributed Optimization Algorithms, Federated Learning

Computational Trade-Offs (Time vs Data vs Accuracy), Lower Bounds

Non-Convex Optimization: Convergence to Critical Points, Alternating minimization, Neural network training
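As a small illustration of the alternating-minimization idea mentioned in the last item, the following sketch (not course material) alternates two closed-form least-squares solves for a non-convex low-rank factorization problem; the matrix sizes and rank are illustrative assumptions.

```python
# Minimal sketch (not course material): alternating minimization for the
# non-convex low-rank factorization problem  min_{U,V} ||M - U V^T||_F^2.
# Each subproblem is a convex least-squares problem solved in closed form.
# Sizes and the rank are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 60, 40, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # exact rank r

U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))
for _ in range(50):
    # Fix V, minimize over U; then fix U, minimize over V.
    U = np.linalg.lstsq(V, M.T, rcond=None)[0].T
    V = np.linalg.lstsq(U, M, rcond=None)[0].T

print("relative error:", np.linalg.norm(M - U @ V.T) / np.linalg.norm(M))
```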

Program:

| Nr | Date  | Topic                                                 | Materials     | Exercises    |
|----|-------|-------------------------------------------------------|---------------|--------------|
| 1  | 23.2. | Introduction, Convexity                               | notes, slides | lab00        |
| 2  | 1.3.  | Gradient Descent                                      | notes, slides | lab01        |
| 3  | 8.3.  | Projected Gradient Descent                            | notes, slides | lab02        |
| 4  | 15.3. | Proximal and Subgradient Descent                      | notes, slides | lab03        |
| 5  | 22.3. | Stochastic Gradient Descent, Non-Convex Optimization  | notes, slides | lab04        |
| .  | 29.3. | Easter vacation                                       | -             |              |
| .  | 5.4.  | Easter vacation                                       | -             |              |
| 6  | 12.4. | Non-Convex Optimization                               | notes, slides | lab05        |
| 7  | 19.4. | Newton's Method & Quasi-Newton                        | notes, slides | lab06        |
| 8  | 26.4. | Coordinate Descent                                    | notes, slides | lab07        |
| 9  | 3.5.  | Frank-Wolfe                                           | notes, slides | lab08        |
| 10 | 10.5. | Mini-Project week                                     | -             |              |
| 11 | 17.5. | Accelerated Gradient, Gradient-free, adaptive methods | notes, slides | lab09        |
| 12 | 24.5. | Opt for ML in Practice                                | notes, slides | lab10        |
| 13 | 31.5. | Opt for ML in Practice                                | notes, slides | Q&A Projects |

Videos:

Exercises:

Each week (starting in week 2), the exercise session mixes theoretical questions with practical Python exercises on the corresponding topic. Solutions to the exercises are available in the lab folder.

Discussion forum (EPFL internal)

Project:

The mini-project focuses on practical implementation: students investigate the real-world performance of one of the studied optimization algorithms or its variants, providing solid empirical evidence for some aspects of its behaviour on a real machine-learning task. The project is mandatory and done in groups of 3 students. It counts for 30% of the final grade. Project reports (3-page PDF) are due June 14th. Here is a detailed project description.
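As a rough illustration of the kind of empirical comparison a mini-project might carry out (not a prescribed setup), the following sketch compares minibatch SGD with and without momentum on a synthetic logistic-regression task; the data and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (not course material): minibatch SGD with and without
# momentum on a synthetic logistic-regression task. Data, step size,
# batch size, and momentum value are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n, d = 2_000, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = (X @ w_true + 0.5 * rng.standard_normal(n) > 0).astype(float)

def loss_and_grad(w, idx):
    # Logistic loss and gradient on a minibatch of indices idx.
    z = np.clip(X[idx] @ w, -30, 30)
    p = 1.0 / (1.0 + np.exp(-z))
    loss = -np.mean(y[idx] * np.log(p + 1e-12) + (1 - y[idx]) * np.log(1 - p + 1e-12))
    grad = X[idx].T @ (p - y[idx]) / len(idx)
    return loss, grad

def run(momentum, steps=500, lr=0.3, batch=32):
    w, v, curve = np.zeros(d), np.zeros(d), []
    for _ in range(steps):
        idx = rng.integers(n, size=batch)
        loss, g = loss_and_grad(w, idx)
        v = momentum * v + g          # heavy-ball style momentum buffer
        w -= lr * v
        curve.append(loss)
    return curve

for beta in (0.0, 0.9):
    print(f"momentum={beta}: final minibatch loss {run(beta)[-1]:.3f}")
```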

Assessment:

Session Exam on Monday 01.07.2024 from 15h15 to 18h15 (INM200, INM202). Format: Closed book. Theoretical questions similar to exercises. You are allowed to bring one cheat sheet (A4 size paper, both sides can be used).

For practice:

Links to related courses and materials

Recommended Books