Complexity in Convex Optimization: Oracles, Algorithms and Applications
| Speaker: | Cristobal Guzman |
|---|---|
| Affiliation: | Georgia Institute of Technology |
| Room: | Mathematics and Computer Building (MC) 5417 |
Abstract:
The need to solve large-scale optimization problems is ubiquitous. Modern applications in signal processing, machine learning, and big data rely at their core on convex optimization solvers. Understanding the limits of performance of such methods is therefore a crucial task.
In this talk we will introduce the theory of oracle complexity as a framework for understanding the efficiency of optimization algorithms. We will see how state-of-the-art algorithms can be viewed as oracle-based, and how the classical theory does not address current applications.
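To fix ideas, an oracle-based method interacts with the objective only through answers to queries, for example function values and gradients. The following minimal sketch (not from the talk; the quadratic objective and all function names are illustrative assumptions) shows gradient descent phrased in this oracle model:

```python
import numpy as np

def first_order_oracle(x):
    """Hypothetical first-order oracle for f(x) = ||x||^2 / 2:
    given a query point x, return (f(x), grad f(x))."""
    return 0.5 * x @ x, x

def gradient_descent(oracle, x0, step=0.5, iters=20):
    """Oracle-based method: the algorithm sees f only through the
    oracle's answers, never the objective's formula itself."""
    x = x0
    for _ in range(iters):
        _, g = oracle(x)
        x = x - step * g
    return x

x = gradient_descent(first_order_oracle, np.array([4.0, -2.0]))
```

Oracle complexity then counts the number of such queries an algorithm needs to reach a given accuracy, independently of its arithmetic cost per iteration.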
Motivated by the latter issue, we extend the theory of oracle complexity in two directions.
First, we establish worst-case lower bounds for smooth convex optimization over non-Euclidean domains. We derive nearly optimal lower bounds for smooth convex minimization over the \ell^1 ball, widely used in sparse signal processing, and over the \ell^{\infty} ball, establishing the near-optimality of the Frank-Wolfe method over this domain.
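The Frank-Wolfe (conditional gradient) method is well suited to the \ell^{\infty} ball because its linear minimization oracle has a closed form there: minimizing a linear function over the box is achieved coordinatewise by the negative sign of the gradient. A minimal sketch (an illustration under assumed names and a toy quadratic objective, not an implementation from the talk):

```python
import numpy as np

def lmo_linf(g, radius=1.0):
    """Linear minimization oracle over the l-infinity ball:
    argmin_{||s||_inf <= radius} <g, s> = -radius * sign(g)."""
    return -radius * np.sign(g)

def frank_wolfe(grad, x0, lmo, iters=50):
    """Frank-Wolfe: projection-free, one linear minimization
    oracle call per iteration, standard steps gamma_t = 2/(t+2)."""
    x = x0
    for t in range(iters):
        s = lmo(grad(x))
        gamma = 2.0 / (t + 2)
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Toy problem: minimize 0.5 * ||x - b||^2 over the unit l-inf ball,
# with the minimizer b chosen inside the ball.
b = np.array([0.5, -0.3])
x = frank_wolfe(lambda x: x - b, np.zeros(2), lmo_linf)
```

Each iterate is a convex combination of box vertices and the starting point, so the method never needs a projection; this cheap oracle is what makes Frank-Wolfe attractive over this domain.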
Second, we extend the complexity analysis to the distributional setting. We provide a unifying information-theoretic approach to deriving lower bounds on average-case oracle complexity, based on a String-Guessing Problem, which is combinatorial in nature.
Finally, time permitting, I will present future directions of research in convex optimization and related areas.