Friday, June 25, 2021 3:30 PM EDT

Title: From low probability to high confidence in stochastic convex optimization

Speaker: Dmitriy Drusvyatskiy
Affiliation: University of Washington
Zoom: Contact Emma Watson

Abstract:

Standard results in stochastic convex optimization bound the number of samples that an algorithm needs to generate a point with small function value in expectation. More nuanced high probability guarantees are rare, and typically either rely on “light-tail” noise assumptions or exhibit worse sample complexity. In this work, we show that a wide class of stochastic optimization algorithms can be augmented with high confidence bounds at an overhead cost that is only logarithmic in the confidence level and polylogarithmic in the condition number.
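To make the gap concrete, here is a minimal sketch (not the speaker's construction) of the classical "boost the confidence" wrapper that the abstract's guarantee improves upon: run an in-expectation solver roughly log(1/delta) times independently and keep the candidate with the smallest value on a fresh validation sample. The toy objective, step size, and sample sizes below are hypothetical choices for illustration only.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
DIM = 10  # hypothetical problem size for the demo


def sample_xi(n=1):
    """Draw samples xi ~ N(0, I); the objective is f(x) = E[0.5 * ||x - xi||^2]."""
    return rng.standard_normal((n, DIM))


def sgd(steps=2000, lr=0.05):
    """Plain SGD on the toy objective; guarantees small error only in expectation."""
    x = rng.standard_normal(DIM)
    for _ in range(steps):
        xi = sample_xi(1)[0]
        x -= lr * (x - xi)  # stochastic gradient of 0.5 * ||x - xi||^2
    return x


def empirical_value(x, n_val=5000):
    """Monte-Carlo estimate of f(x) from a fresh validation sample."""
    xi = sample_xi(n_val)
    return float(np.mean(0.5 * np.sum((x - xi) ** 2, axis=1)))


def boost_confidence(delta=1e-6):
    """Run the base solver ceil(log(1/delta)) times and keep the best candidate.

    By Markov's inequality each run succeeds with constant probability, so the
    chance that every run fails is exponentially small in the number of runs.
    """
    k = max(1, math.ceil(math.log(1.0 / delta)))
    candidates = [sgd() for _ in range(k)]
    return min(candidates, key=empirical_value)


x_hat = boost_confidence()
# The minimizer is x = 0 with value 0.5 * DIM, so this prints the suboptimality estimate.
print("estimated suboptimality:", empirical_value(x_hat) - 0.5 * DIM)
```

This naive wrapper already pays only a logarithmic factor in the confidence level, but it needs an accurate validation step and extra independent runs; the work described in the abstract obtains high-confidence bounds for a wide class of algorithms with overhead that is also only polylogarithmic in the condition number.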
