EE236C - Optimization Methods for Large-Scale Systems (Spring 2013-14)

Lecture notes

  1. Gradient method

  2. Quasi-Newton methods

  3. Conjugate gradient method

  4. Subgradients

  5. Subgradient method

  6. Proximal gradient method

  7. Fast proximal gradient methods

  8. Conjugate functions

  9. The proximal mapping

  10. Proximal point method

  11. Dual decomposition

  12. Dual proximal gradient method

  13. Douglas-Rachford splitting and ADMM

  14. Conic optimization

  15. Barrier functions

  16. Path-following methods

  17. Symmetric cones

  18. Primal-dual interior-point methods

Additional lectures (from previous editions of the course)

Homework and project

Homework solutions and grades are posted on the EEweb course website. (Follow the links to “Assignments” or “Grades”.)

Course information

Lectures: Boelter Hall 5420. Monday and Wednesday, 10:00-11:50 AM.

Description. The course continues EE236B and covers several advanced and current topics in optimization, with an emphasis on large-scale algorithms for convex optimization. The following subjects will be discussed.

  1. First-order methods for large-scale optimization: gradient and subgradient method, conjugate gradient method, proximal gradient method, accelerated gradient methods.

  2. Decomposition and splitting methods: dual decomposition, augmented Lagrangian method, alternating direction method of multipliers.

  3. Interior-point algorithms for conic optimization.
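As a concrete illustration of the first-order methods in item 1, the sketch below applies the proximal gradient method (ISTA) to a lasso problem, minimize (1/2)||Ax - b||^2 + lam*||x||_1. The random problem instance, the function names, and the step-size choice 1/L with L = ||A||_2^2 are illustrative assumptions, not part of the course material.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: componentwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step, iters=2000):
    # Minimize (1/2)||Ax - b||^2 + lam*||x||_1 via the iteration
    #   x+ = prox_{step*lam*||.||_1}(x - step * grad f(x)),
    # where f(x) = (1/2)||Ax - b||^2 is the smooth part.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # gradient of smooth term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Hypothetical test problem: sparse x_true recovered from noiseless data.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 3.0]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1/L, L = ||A||_2^2
x_hat = proximal_gradient_lasso(A, b, lam=0.1, step=step)
```

With step size 1/L the iteration is guaranteed to converge; faster variants (accelerated/FISTA) covered in lecture 7 add a momentum term to the same update.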

Textbook and lecture notes. The lecture notes will be posted on this website. The material is largely based on the following books, and on the notes of the course EE364b (Convex Optimization II) at Stanford University.

Course requirements. Several homework assignments and a project.

Grading. Approximate weights in the final grade: homework 20%, project 80%.