Title: A framework for applying subgradient methods to conic optimization problems
Speaker: James Renegar
Affiliation: Cornell University
Room: 5501
Abstract:
A framework is presented whereby a general convex conic optimization problem is transformed into an equivalent convex optimization problem whose only constraints are linear equations (one more equation than the original problem), and whose objective function is Lipschitz continuous. Virtually any subgradient method can be applied to solve the equivalent problem. Two methods are analyzed. Moreover, we show that for a broad class of optimization problems (namely, hyperbolic programs), the equivalent problems can be naturally "smoothed," thus allowing accelerated gradient methods to be applied, resulting in superior iteration bounds. Perhaps most surprising is that the transformation is simple, as is the basic theory, and yet the approach has been overlooked until now, a blind spot.
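The abstract does not spell out the transformed problem, but its shape — maximize a Lipschitz concave function over an affine subspace — can be illustrated in a special case. For the semidefinite cone, a natural Lipschitz objective of this kind is the minimum eigenvalue. The following is a minimal sketch, not the speaker's algorithm: a projected subgradient ascent for max λ_min(X) subject to a single affine constraint. The constraint (trace(X) = n), step sizes, and problem size are invented for the example.

```python
import numpy as np

# Illustrative sketch only: projected subgradient ascent for
#     max  lambda_min(X)   s.t.  trace(X) = n,
# an affinely constrained problem with a Lipschitz concave objective,
# matching the general shape the abstract describes (semidefinite case).
# All problem data here are made up for illustration.

rng = np.random.default_rng(0)
n = 5

def project_affine(X):
    # Orthogonal projection onto {X : trace(X) = n}.
    shift = (n - np.trace(X)) / n
    return X + shift * np.eye(n)

def lam_min_and_supergrad(X):
    # lambda_min is concave; a supergradient at X is v v^T,
    # where v is a unit eigenvector for the smallest eigenvalue.
    w, V = np.linalg.eigh(X)  # eigenvalues in ascending order
    v = V[:, 0]
    return w[0], np.outer(v, v)

# Start from a random symmetric point on the affine subspace.
X = rng.standard_normal((n, n))
X = project_affine((X + X.T) / 2)

best = -np.inf
for k in range(1, 501):
    val, G = lam_min_and_supergrad(X)
    best = max(best, val)
    # Diminishing step sizes, projected back onto the constraint.
    X = project_affine(X + (1.0 / np.sqrt(k)) * G)

# The optimum here is lambda_min = 1, attained at X = I
# (lambda_min can never exceed the mean eigenvalue trace(X)/n = 1).
print(f"best lambda_min found: {best:.4f}")
```

Since the objective is Lipschitz but nonsmooth (the smallest eigenvalue can be multiple), convergence is the slow O(1/sqrt(k)) rate typical of subgradient methods; the abstract's point is that for hyperbolic programs such objectives can be smoothed, unlocking accelerated gradient methods with better iteration bounds.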