AMS 230 – Numerical Optimization. This graduate course provides an introduction to a variety of widely used numerical algorithms for solving optimization problems. The course focuses on the derivation of numerical methods, mathematical analysis of their performance, and practical implementation of computational algorithms for continuous optimization problems.

Instructor: Qi Gong, Associate Professor, BE 361A, email: qigong at soe.ucsc.edu

Lectures: MoWeFr 10:40AM - 11:45AM, BE 165

Webcast: https://webcast.ucsc.edu

Office Hours: Wednesday, 1:00PM to 3:00PM, BE 361A

Textbook: "Numerical Optimization" by Jorge Nocedal and Stephen J. Wright, Springer, Second Edition, 2006

References: 

"Nonlinear Programming” by Dimitri Bertsekas, Athena Scientific, 2nd edition, 1999
"Iterative Methods for Optimization” by C. T. Kelly, SIAM, 1999
"Convex Optimization” by Stephen Boyd and Lieven Vandenberghe, Cambridge University Press, 2004
"Numerical Optimization: Theoretical and Practical Aspects” by J. Frederic Bonnans, Jean Charles Gilbert, Claude Lemarechal, Claudia A. Sagastizbal, Springer, 2006

Grading: Homework: 100%

Tentative Schedule:  

  • Lectures 1-3: Introduction to numerical optimization and mathematical preliminaries (Chapters 1 & 2 of Nocedal and Wright). Introduction: classification of optimization problems, application examples, and basic numerical strategies for unconstrained optimization. Mathematical preliminaries: necessary and sufficient optimality conditions, convex sets and functions, sequences and rates of convergence, and descent directions. (See Sketch 1 after the schedule.)

  • Lectures 4-8: Line search methods (Chapter 3 of Nocedal and Wright). Wolfe conditions and step-length selection algorithms; convergence of line search methods; steepest descent method; Newton's method; Newton's method with Hessian modification. (See Sketch 2 after the schedule.)

  • Lectures 9-12: Linear conjugate gradient method and its convergence properties; conjugate gradient methods for nonlinear problems (Chapter 5 of Nocedal and Wright). (See Sketch 3 after the schedule.)

  • Lectures 13-16: Trust-region methods for unconstrained optimization (Chapter 4 of Nocedal and Wright). (See Sketch 4 after the schedule.)

  • Lectures 17-20: Quasi-Newton methods: the BFGS method, limited-memory BFGS, the symmetric rank-one (SR1) method, and convergence analysis of quasi-Newton methods (Chapter 6 of Nocedal and Wright). (See Sketch 5 after the schedule.)

  • Lectures 21-23: Least-squares problems (Chapter 10 of Nocedal and Wright) and numerical algorithms for nonlinear equations (Chapter 11 of Nocedal and Wright). (See Sketch 6 after the schedule.)

  • Lectures 24-26: Fundamental theory of constrained optimization: constraint qualifications; Karush-Kuhn-Tucker (KKT) first-order optimality conditions; Lagrange multipliers and sensitivity (Chapter 12 of Nocedal and Wright). (See Sketch 7 after the schedule.)

  • Lectures 27-30: Selected topics in constrained nonlinear programming, e.g., penalty methods, augmented Lagrangian methods, and sequential quadratic programming. (See Sketch 8 after the schedule.)
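
The sketches below are optional, minimal Python/NumPy illustrations of some of the methods and conditions listed in the schedule. They are written for this syllabus with made-up test problems, starting points, and parameter values; they are not reference implementations from the textbook.

Sketch 1 (Lectures 1-3): checking the first-order necessary and second-order sufficient optimality conditions for a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x. The matrix A and vector b are made up for illustration.

```python
import numpy as np

# Strictly convex quadratic f(x) = 0.5 x^T A x - b^T x (data made up for illustration).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

x_star = np.linalg.solve(A, b)            # candidate minimizer: solves grad f(x) = A x - b = 0

print("gradient at x*:", A @ x_star - b)               # first-order necessary condition: ~0
print("Hessian eigenvalues:", np.linalg.eigvalsh(A))   # second-order sufficient condition: all > 0
```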
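
Sketch 2 (Lectures 4-8): steepest descent with a backtracking (Armijo sufficient-decrease) line search; a full Wolfe-condition line search, as covered in class, adds a curvature test. The Rosenbrock test function, starting point, and tolerances are assumptions for illustration.

```python
import numpy as np

def backtracking_line_search(f, x, p, g, alpha=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds."""
    fx = f(x)
    while f(x + alpha * p) > fx + c1 * alpha * (g @ p):
        alpha *= rho
    return alpha

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=20000):
    """Steepest descent: search direction p = -grad f(x), step length from backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g
        x = x + backtracking_line_search(f, x, p, g) * p
    return x

# Rosenbrock function, a standard unconstrained test problem with minimizer (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(steepest_descent(f, grad, x0=[-1.2, 1.0]))   # slowly approaches (1, 1)
```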
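
Sketch 3 (Lectures 9-12): the linear conjugate gradient method for A x = b with A symmetric positive definite, which is equivalent to minimizing the quadratic 0.5 x^T A x - b^T x. The small test system is made up.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Linear CG for A x = b with A symmetric positive definite
    (equivalently, minimize 0.5 x^T A x - b^T x)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = A @ x - b              # residual = gradient of the quadratic
    p = -r                     # first search direction
    for _ in range(max_iter or n):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)          # exact minimizer along p
        x = x + alpha * p
        r_new = r + alpha * Ap
        beta = (r_new @ r_new) / (r @ r)    # next direction is conjugate to the previous ones
        p = -r_new + beta * p
        r = r_new
    return x

# Small symmetric positive definite test system (made up).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # should match np.linalg.solve(A, b)
```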
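
Sketch 4 (Lectures 13-16): the Cauchy point of the trust-region subproblem, i.e., the minimizer of the quadratic model along the steepest-descent direction within the trust region; practical methods (dogleg, two-dimensional subspace minimization) improve on this step. The gradient, model Hessian, and radius below are made up.

```python
import numpy as np

def cauchy_point(g, B, delta):
    """Cauchy point for the trust-region subproblem
       min_p  g^T p + 0.5 p^T B p   subject to  ||p|| <= delta."""
    gnorm = np.linalg.norm(g)
    ps = -(delta / gnorm) * g            # steepest-descent step to the trust-region boundary
    gBg = g @ B @ g
    if gBg <= 0:
        tau = 1.0                        # model decreases all the way to the boundary
    else:
        tau = min(gnorm**3 / (delta * gBg), 1.0)
    return tau * ps

# Example with an indefinite model Hessian (data made up).
g = np.array([1.0, -2.0])
B = np.array([[2.0, 0.0], [0.0, -1.0]])
print(cauchy_point(g, B, delta=0.5))
```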
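
Sketch 5 (Lectures 17-20): a minimal BFGS iteration that maintains an inverse-Hessian approximation and updates it with the standard BFGS formula. It reuses the Rosenbrock test function and an Armijo backtracking search; a Wolfe line search is preferable in practice because it guarantees the curvature condition y^T s > 0, which is only checked here.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimal BFGS sketch: maintain an inverse-Hessian approximation H and
    update it with the BFGS formula after each accepted step."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                         # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                        # quasi-Newton search direction
        alpha, fx = 1.0, f(x)             # Armijo backtracking line search
        while f(x + alpha * p) > fx + 1e-4 * alpha * (g @ p):
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if y @ s > 1e-12:                 # skip the update if curvature is too small
            rho = 1.0 / (y @ s)
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Same Rosenbrock test function as in Sketch 2.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(bfgs(f, grad, x0=[-1.2, 1.0]))      # converges to (1, 1) far faster than steepest descent
```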
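
Sketch 6 (Lectures 21-23): Gauss-Newton for a nonlinear least-squares problem min 0.5 ||r(x)||^2, solving the linearized problem min_p ||J p + r|| for each step. The exponential curve-fitting data are synthetic and noise-free, so the expected answer is roughly the generating parameters.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, tol=1e-10, max_iter=100):
    """Gauss-Newton for min 0.5 * ||r(x)||^2: each step solves the linearized
    least-squares problem  min_p ||J p + r||  and sets x <- x + p."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = residual(x), jacobian(x)
        if np.linalg.norm(J.T @ r) < tol:          # first-order optimality measure
            break
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + p
    return x

# Synthetic, noise-free curve fit: model y(t) = exp(a t) + b with true (a, b) = (0.8, 0.3).
t = np.linspace(0.0, 1.0, 20)
y = np.exp(0.8 * t) + 0.3
residual = lambda x: np.exp(x[0] * t) + x[1] - y
jacobian = lambda x: np.column_stack((t * np.exp(x[0] * t), np.ones_like(t)))
print(gauss_newton(residual, jacobian, x0=[0.0, 0.0]))   # expect approximately [0.8, 0.3]
```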
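
Sketch 7 (Lectures 24-26): for an equality-constrained quadratic program, the KKT conditions (stationarity of the Lagrangian plus feasibility) reduce to a single linear system whose solution also exposes the Lagrange multiplier. The problem data below are made up.

```python
import numpy as np

# min 0.5 x^T G x + c^T x   subject to   A x = b   (data made up for illustration).
# With Lagrangian L(x, lam) = 0.5 x^T G x + c^T x - lam^T (A x - b), the KKT conditions are
#   G x + c - A^T lam = 0   (stationarity)   and   A x = b   (feasibility),
# i.e., the linear system  [G  -A^T; A  0] [x; lam] = [-c; b].
G = np.array([[6.0, 2.0], [2.0, 5.0]])
c = np.array([-8.0, -3.0])
A = np.array([[1.0, 1.0]])
b = np.array([3.0])

n, m = G.shape[0], A.shape[0]
KKT = np.block([[G, -A.T], [A, np.zeros((m, m))]])
rhs = np.concatenate([-c, b])
sol = np.linalg.solve(KKT, rhs)
x, lam = sol[:n], sol[n:]

print("x* =", x)
print("lambda* =", lam)
print("stationarity residual:", G @ x + c - A.T @ lam)   # should be ~0
print("feasibility residual:", A @ x - b)                # should be ~0
```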
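
Sketch 8 (Lectures 27-30): a quadratic penalty method for a single equality constraint: approximately minimize f(x) + 0.5 mu c(x)^2 with a crude gradient-descent inner solver, then increase mu. The toy problem min x1 + x2 subject to x1^2 + x2^2 = 2 (minimizer near (-1, -1)), the starting point, and the schedule for mu are assumptions for illustration.

```python
import numpy as np

def quadratic_penalty(f, grad_f, c, grad_c, x0, mu=1.0, outer=6, inner=500):
    """Quadratic penalty sketch for  min f(x)  subject to  c(x) = 0  (one constraint):
    approximately minimize Q(x; mu) = f(x) + 0.5 * mu * c(x)**2, then increase mu."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        Q = lambda z, mu=mu: f(z) + 0.5 * mu * c(z)**2
        gradQ = lambda z, mu=mu: grad_f(z) + mu * c(z) * grad_c(z)
        for _ in range(inner):                      # crude gradient-descent inner solver
            g = gradQ(x)
            if np.linalg.norm(g) < 1e-6:
                break
            alpha, qx = 1.0, Q(x)                   # Armijo backtracking on Q
            while Q(x - alpha * g) > qx - 1e-4 * alpha * (g @ g):
                alpha *= 0.5
            x = x - alpha * g
        mu *= 10.0                                  # tighten the penalty
    return x

# Toy problem: min x1 + x2  subject to  x1^2 + x2^2 - 2 = 0; the minimizer is (-1, -1).
f = lambda x: x[0] + x[1]
grad_f = lambda x: np.array([1.0, 1.0])
c = lambda x: x[0]**2 + x[1]**2 - 2.0
grad_c = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])
print(quadratic_penalty(f, grad_f, c, grad_c, x0=[-0.5, -0.5]))   # approaches (-1, -1)
```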


Disability Resource Center: UC Santa Cruz is committed to creating an academic environment that supports its diverse student body. If you are a student with a disability who requires accommodations to achieve equal access in this course, please submit your Accommodation Authorization Letter from the Disability Resource Center (DRC) to me privately during my office hours or by appointment, preferably within the first two weeks of the quarter. At this time, I would also like us to discuss ways we can ensure your full participation in the course. I encourage all students who may benefit from learning more about DRC services to contact DRC by phone at 831-459-2089, or by email at drc@ucsc.edu.