Tutte seminar - Yurii Nesterov

Friday, December 7, 2007 3:30 pm - 4:30 pm EST (GMT -05:00)

Gradient Methods for Minimizing Composite Objective Function

Speaker: Yurii Nesterov
Affiliation: Catholic University of Louvain
Room: Mathematics & Computer Building (MC) 5158

Abstract:

In this talk we present several methods for solving optimization problems whose objective function is a sum of two convex terms: one is smooth and given by a black-box oracle, and the other is general but simple, with known structure. It appears that, despite the bad properties of the sum, such problems can be solved with the efficiency typical for the good part of the objective. For these problems, we consider primal and dual variants of the gradient method (which converge as $O\left({1 \over k}\right)$), and an accelerated multistep version, which converges as $O\left({1 \over k^2}\right)$, where $k$ is the iteration counter. For all methods, we present very efficient "line search" procedures and show that the additional computational work needed to estimate the unknown problem class parameters can at most double the complexity of each iteration.
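The basic $O(1/k)$ scheme for such composite objectives can be sketched as a proximal gradient method: the smooth term is handled through its black-box gradient, the simple term through a closed-form proximal step, and a backtracking "line search" adapts to the unknown Lipschitz constant at the cost of at most a doubled iteration. The following is a minimal illustrative sketch (not the speaker's exact method); the LASSO test problem, the function names, and all parameter values are hypothetical choices for demonstration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 — the "simple" term solved in closed form.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient(g, grad_g, prox, x0, L0=1.0, iters=200):
    """Proximal gradient method with backtracking.

    Minimizes g(x) + h(x): g is smooth (accessed only via value/gradient),
    h is simple (accessed only via its prox). Backtracking doubles the
    trial Lipschitz estimate L until the quadratic upper model holds,
    so each iteration costs at most about twice a fixed-step iteration.
    """
    x, L = x0.copy(), L0
    for _ in range(iters):
        gx, grad = g(x), grad_g(x)
        while True:
            y = prox(x - grad / L, 1.0 / L)
            d = y - x
            # Accept the step once g(y) <= g(x) + <grad, d> + (L/2)||d||^2.
            if g(y) <= gx + grad @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= 2.0
        x = y
        L = max(L0, L / 2.0)  # let the estimate shrink again between iterations
    return x

# Hypothetical example: LASSO, min 0.5*||A x - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.array([1.0, -2.0] + [0.0] * 8)
b = A @ x_true + 0.01 * rng.standard_normal(30)
lam = 0.5
g = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_g = lambda x: A.T @ (A @ x - b)
prox = lambda v, t: soft_threshold(v, lam * t)
x_star = prox_gradient(g, grad_g, prox, np.zeros(10))
```

The accelerated $O(1/k^2)$ variant mentioned in the abstract differs only in that the prox-gradient step is taken from an extrapolated point combining the two most recent iterates; the line-search logic is unchanged.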