This course will be held online. Details on the platform and schedule will be published soon.
22 hours (8 lectures and 3 labs)
20-24 July 2020
Convex optimization plays a key role in data science. The objective of this course is to provide the basic tools and methods at the core of modern nonlinear convex optimization. Starting from the gradient descent method, we will cover some state-of-the-art algorithms, including proximal gradient methods, dual algorithms, stochastic gradient descent, and randomized block-coordinate descent methods.
Fill in this form to apply for the Introduction to Convex Optimization course
Motivation from applications. Basic concepts: convex sets and functions.
General convergence principles and the gradient descent algorithm
Lab - Gradient descent
The gradient descent method in action
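As a minimal illustration of the kind of exercise this lab covers, gradient descent with constant step size 1/L can be sketched in NumPy on a least-squares problem (the data below are synthetic, chosen only for the example):

```python
import numpy as np

# Minimize f(x) = 0.5 * ||A x - b||^2 by gradient descent with
# constant step size 1/L, where L = ||A||^2 is the Lipschitz
# constant of the gradient. Problem data are random, for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)

L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
x = np.zeros(10)
for _ in range(500):
    grad = A.T @ (A @ x - b)           # gradient of f at x
    x = x - grad / L                   # gradient descent step

# Compare with the least-squares solution computed directly.
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x, x_star, atol=1e-6))
```

With a well-conditioned random matrix the iteration converges linearly, so 500 steps already match the direct solver to high accuracy.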
Nonsmooth differential theory
Duality theory part I
The proximal gradient method
Lab - Sparsity
Solving problems with sparsity constraints
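A sketch of the kind of method this lab applies: the proximal gradient method (ISTA) for the Lasso problem min 0.5‖Ax − b‖² + λ‖x‖₁, whose proximity operator is componentwise soft-thresholding. The data, sparsity level, and λ below are arbitrary illustrative choices:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t*||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Synthetic sparse recovery problem (illustrative only).
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 3.0                       # sparse ground truth
b = A @ x_true                         # noiseless observations
lam = 0.1

L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
x = np.zeros(100)
for _ in range(5000):
    grad = A.T @ (A @ x - b)           # gradient of 0.5*||A x - b||^2
    x = soft_threshold(x - grad / L, lam / L)   # forward-backward step
```

The ℓ₁ proximal step drives most coordinates exactly to zero, so the iterates recover the sparse support of the ground truth.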
Proximity operators of spectral functions. Duality theory part II
Dual algorithms and applications
Stochastic gradient descent and randomized coordinate descent
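A hedged sketch of stochastic gradient descent on a least-squares objective, sampling one data point per iteration with a decreasing step size (the data and step-size schedule below are arbitrary choices for illustration):

```python
import numpy as np

# SGD on f(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2, using the
# gradient of a single randomly chosen term at each step.
rng = np.random.default_rng(3)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true                         # noiseless targets

x = np.zeros(d)
for k in range(20000):
    i = rng.integers(n)                # sample one data point
    g = (A[i] @ x - b[i]) * A[i]       # stochastic gradient at x
    x -= 0.05 / (1.0 + k / 2000.0) * g # decreasing step size
```

Each step costs O(d) instead of O(nd) for the full gradient; since the targets here are noiseless, the iterates converge to the true parameter vector.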
Lab - Matrix completion
Matrix completion with nuclear norm regularization and Group Lasso
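The computational building block this lab relies on is the proximity operator of the nuclear norm, which soft-thresholds the singular values (singular value thresholding). A small sketch on a random low-rank matrix, for illustration only:

```python
import numpy as np

def svt(M, t):
    """Proximity operator of t*||.||_* : soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

rng = np.random.default_rng(2)
X = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 5))  # rank <= 3
Y = svt(X, 1.0)
# The singular values of Y are max(sigma_i(X) - 1, 0), so the
# thresholding can only lower the rank of the iterate.
print(np.linalg.svd(Y, compute_uv=False))
```

Iterating a gradient step on the data-fit term followed by this operator gives the proximal gradient method for nuclear-norm-regularized matrix completion.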