Posted notes: accelerated proximal gradient methods (updated); Spring 2014 notes; additional lectures from previous editions of the course.
Homework solutions and grades are posted on the EEweb course website. (Follow the links to “Assignments” or “Grades”.)
Lectures: Tuesday and Thursday, 12:00PM-1:50PM, Kinsey 1200B.
Description. The course continues EE236B and covers several advanced and current topics in optimization, with an emphasis on large-scale algorithms for convex optimization. Topics include first-order methods (gradient and subgradient methods, conjugate gradient method, proximal gradient method, accelerated gradient methods), decomposition and splitting methods (dual decomposition, augmented Lagrangian method, alternating direction method of multipliers, monotone operators and operator splitting), and, possibly, interior-point algorithms for conic optimization.
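To give a flavor of the first-order methods listed above, here is a minimal sketch of the proximal gradient method (in its basic ISTA form) applied to the lasso problem, minimize (1/2)||Ax - b||^2 + lam*||x||_1. The problem data, fixed step size 1/L, and iteration count are illustrative assumptions, not taken from the course notes.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, num_iters=500):
    """Basic proximal gradient iteration x+ = prox_{t*g}(x - t*grad f(x))
    for f(x) = (1/2)||Ax - b||^2 and g(x) = lam*||x||_1."""
    # Fixed step size t = 1/L, where L = ||A||_2^2 is the Lipschitz
    # constant of the gradient of f (an assumed, standard choice).
    t = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)            # gradient of the smooth term
        x = soft_threshold(x - t * grad, t * lam)
    return x
```

For example, when A is the identity the lasso solution is the soft-thresholded vector soft_threshold(b, lam), which the iteration reaches immediately since t = 1.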
Textbook and lecture notes. The lecture notes will be posted on this website. Many of the topics are covered in the following books and in the course EE364b (Convex Optimization II) at Stanford University.
D. Bertsekas, Convex Optimization Algorithms, Athena Scientific.
D. Bertsekas and J. Tsitsiklis, Parallel and Distributed Computation, Athena Scientific.
Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer.
J. Nocedal and S. J. Wright, Numerical Optimization, Springer.
B. T. Polyak, Introduction to Optimization, Optimization Software.
Course requirements. Several homework assignments and a project.
Grading. Approximate weights in the final grade: homework 20%, project 80%.