ECE236C - Optimization Methods for Large-Scale Systems

Course material for Spring 2020 will be posted at the CCLE course website.

Lecture notes (Spring 2019)

  1. Introduction
    Gradient method

  2. Subgradients

  3. Subgradient method

  4. Proximal gradient method

  5. Conjugate functions

  6. The proximal mapping

  7. Accelerated proximal gradient methods

  8. Proximal point method

  9. Dual decomposition

  10. Dual proximal gradient method

  11. Douglas-Rachford splitting and ADMM

  12. Primal-dual proximal methods

  13. Generalized distances and mirror descent

  14. Generalized proximal gradient method

  15. Conjugate gradient method

  16. Newton's method

  17. Quasi-Newton methods

  18. Gauss-Newton method

Lectures from previous years

Conic optimization and interior-point methods

First-order methods

Localization and cutting-plane methods

Course information

Description. The course continues ECE236B and covers several advanced and current topics in optimization, with an emphasis on large-scale algorithms for convex optimization. This includes first-order methods for large-scale optimization (gradient and subgradient methods, the conjugate gradient method, the proximal gradient method, accelerated gradient methods), decomposition and splitting methods (dual decomposition, the augmented Lagrangian method, the alternating direction method of multipliers, monotone operators and operator splitting), and (possibly) interior-point algorithms for conic optimization.
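As a flavor of the first-order methods listed above, the following is a minimal sketch of the proximal gradient method applied to the lasso problem, minimize (1/2)||Ax - b||^2 + lam*||x||_1. The function names, step-size choice (a fixed step 1/L with L the Lipschitz constant of the gradient), and the random test instance are illustrative choices, not material from the course notes.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, num_iters=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by the proximal gradient method.
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the smooth gradient
    t = 1.0 / L                     # fixed step size
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)    # gradient of the smooth term
        x = soft_threshold(x - t * grad, t * lam)
    return x

# Small demo on a synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = proximal_gradient(A, b, lam=0.1)
```

Each iteration takes a gradient step on the smooth least-squares term, then applies the proximal operator of the nonsmooth l1 term; with step size 1/L the objective decreases monotonically.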

Lecture notes. The lecture notes will be posted on this website. Many of the topics are covered in the following books and in the course EE364b (Convex Optimization II) at Stanford University.

Course requirements. Weekly homework assignments and a project.

Grading. Approximate weights in the final grade: homework 30%, project 70%.