Title: Principled algorithms for finding local minima
Speaker: Oliver Hinder
Affiliation: Stanford University
Room: MC 5501
Abstract:
Convex optimization is the cornerstone of continuous optimization, but many real-world problems are nonconvex: neural network training, airplane design, water network management, etc. This two-part talk explores my work developing algorithms for finding local minima of nonconvex functions.
The first part of this talk gives an overview of my work with Yair Carmon, John Duchi, and Aaron Sidford, which characterizes the oracle complexity of finding stationary points of nonconvex functions in a dimension-free setting.
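As background (these definitions are standard and are not spelled out in the abstract itself), the dimension-free setting typically asks for an epsilon-stationary point, i.e. a point x with

\[
  \|\nabla f(x)\| \le \epsilon ,
\]

and the classical guarantee is that gradient descent on an L-smooth function with initial gap \(\Delta = f(x_0) - \inf_x f(x)\) finds such a point within

\[
  O\!\left(\frac{L\,\Delta}{\epsilon^{2}}\right)
\]

gradient evaluations, independently of the dimension. The oracle-complexity question studied in this line of work is how much such rates can be improved by any first-order (or higher-order) method.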
The second and main part of this talk focuses on an interior point method (IPM) for optimization problems with nonconvex constraints. This is joint work with my advisor Yinyu Ye. We propose a one-phase IPM that, through careful initialization and updates of the slack variables, is guaranteed to find a first-order certificate of local infeasibility, local optimality, or unboundedness of the (shifted) feasible region. It thereby avoids the typical two-phase or penalty approach. Comparisons on standard test sets with a state-of-the-art IPM (IPOPT) indicate similar overall performance but much better infeasibility detection.
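To fix notation (a generic sketch only; the precise shifted slack formulation used in the talk may differ), a nonconvex constrained problem

\[
  \min_{x} \ f(x) \quad \text{s.t.} \quad c(x) \le 0
\]

is typically handled by an IPM via slack variables \(s > 0\) and a log-barrier subproblem

\[
  \min_{x,\; s > 0} \ f(x) - \mu \sum_{i} \log s_i \quad \text{s.t.} \quad c(x) + s = 0 ,
\]

with the barrier parameter \(\mu\) driven to zero. A one-phase method works with the slacks directly from the initial point, rather than first solving a separate feasibility phase, and terminates with a first-order certificate of local optimality, local infeasibility, or unboundedness.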