Structured smoothness in modern convex optimization
Speaker: Quoc Tran-Dinh
Affiliation: EPFL, Switzerland
Room: Mathematics and Computer Building (MC) 5417
Abstract:
The importance of convex optimization techniques has dramatically increased in the last decade due to the rise of new theory for structured sparsity and low-rankness, and the success of statistical learning models such as support vector machines. Convex optimization formulations are now employed with great success in various subfields of data science, including machine learning, compressive sensing, medical imaging, geophysics, and bioinformatics. However, the renewed popularity of convex optimization places convex algorithms under tremendous pressure to accommodate increasingly difficult nonlinear models and nonsmooth cost functions with ever-increasing data sizes. Overcoming these emerging challenges requires nonconventional ways of exploiting useful yet hidden structures within the underlying convex optimization models.

To this end, I will demonstrate how to exploit the classical notion of smoothness in novel ways to develop fully rigorous methods for fundamental convex optimization settings, from primal-dual frameworks to composite convex minimization, and from proximal path-following schemes to barrier smoothing techniques. Some of these results play key roles in convex optimization, such as unification and uncertainty principles for augmented Lagrangian and decomposition methods, and have important computational implications, such as solving convex programs over the positive semidefinite cone without any matrix decompositions.
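To make the composite convex minimization setting mentioned above concrete, here is a minimal sketch of the standard proximal gradient method applied to a small lasso problem (smooth least-squares term plus nonsmooth l1 term). The problem instance, step size, and parameter values are illustrative assumptions, not material from the talk:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step, iters=500):
    # Minimize F(x) = f(x) + g(x) with
    #   f(x) = 0.5 * ||A x - b||^2   (smooth part, gradient A^T (A x - b))
    #   g(x) = lam * ||x||_1         (nonsmooth part, prox = soft-thresholding)
    # via the update x <- prox_{step * g}( x - step * grad f(x) ).
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Hypothetical small instance: recover a sparse vector from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true + 0.01 * rng.standard_normal(40)

# Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of grad f.
step = 1.0 / np.linalg.norm(A, 2) ** 2
x_hat = proximal_gradient_lasso(A, b, lam=0.1, step=step)
```

The update alternates a gradient step on the smooth term with the exact proximal map of the nonsmooth term, which is the structural split the "composite" framing refers to.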