EE236C - Optimization Methods for Large-Scale Systems (Spring 2013-14)

Lecture notes

  1. Gradient method

  2. Quasi-Newton methods

  3. Conjugate gradient method

  4. Subgradients

  5. Subgradient method

  6. Proximal gradient method

  7. Fast proximal gradient methods

Spring 2012 lecture notes (to be revised as the course proceeds)

Additional lectures (from previous editions of the course)

Homework and project

Homework solutions and grades are posted on the EEweb course website. (Follow the links to “Assignments” or “Grades”.)

Course information

Lectures: Boelter 5420. Monday and Wednesday, 10:00-11:50 AM.

Description. The course continues EE236B and covers several advanced and current topics in optimization, with an emphasis on large-scale algorithms for convex optimization. The following subjects will be discussed.

  1. First-order methods for large-scale optimization: gradient and subgradient method, conjugate gradient method, proximal gradient method, accelerated gradient methods.

  2. Decomposition and splitting methods: dual decomposition, augmented Lagrangian method, alternating direction method of multipliers.

  3. Interior-point algorithms for conic optimization.
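To give a flavor of the first-order methods in item 1, the following is a minimal sketch of the proximal gradient method (ISTA) applied to l1-regularized least squares. It is an illustration only, not course material: the problem instance, function names, and step-size choice are assumptions made here for the example.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step, iters=500):
    # Minimize (1/2)||Ax - b||^2 + lam*||x||_1 by iterating
    #   x+ = prox_{step*lam*||.||_1}(x - step * A^T (A x - b)),
    # i.e., a gradient step on the smooth term followed by the prox
    # of the nonsmooth term (soft-thresholding).
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)            # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

A fixed step size of 1/L, where L is the largest eigenvalue of A^T A (the Lipschitz constant of the gradient of the smooth term), is the standard choice guaranteeing convergence for this method.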

Textbook and lecture notes. The lecture notes will be posted on this website. The material is largely based on the following books, and on the notes of the course EE364b (Convex Optimization II) at Stanford University.

Course requirements. Several homework assignments and a project.

Grading. Approximate weights in the final grade: homework 20%, project 80%.