Monday, September 9, 2019 — 4:00 PM EDT

Shai Ben-David, School of Computer Science, University of Waterloo

"A basic machine learning problem is independent of set theory"

The mathematical foundations of machine learning play a key role in the development of the field. They improve our understanding and provide tools for designing new learning paradigms. The advantages of mathematical analysis, however, sometimes come at a cost. Gödel and Cohen showed, in a nutshell, that not everything is provable. Here we show that machine learning shares this fate. We describe simple scenarios where learnability can neither be proved nor refuted using the standard axioms of mathematics. Our proof is based on the fact that the continuum hypothesis can neither be proved nor refuted. We show that, in some cases, a solution to the 'estimating the maximum expectation' problem is equivalent to the continuum hypothesis.
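For readers who want a more concrete picture, the following is a hedged sketch of the 'estimating the maximum expectation' (EMX) setup as it appears in the speaker's published work on this result; the notation here is a reconstruction, not a quotation from the talk.

```latex
% EMX (estimating the maximum), sketched from the published paper
% this talk is based on; notation reconstructed, not quoted.
%
% Fix a domain $X$, a family $\mathcal{F}$ of subsets of $X$, and an
% unknown probability distribution $P$ over $X$. A learner $G$ sees an
% i.i.d. sample $S \sim P^m$ and outputs a set $G(S) \in \mathcal{F}$.
%
% $G$ is an $(\epsilon,\delta)$-EMX learner for $\mathcal{F}$ if, for
% every distribution $P$,
\[
  \Pr_{S \sim P^m}\!\Big[\, P\big(G(S)\big) \;\ge\;
      \sup_{F \in \mathcal{F}} P(F) \;-\; \epsilon \,\Big]
  \;\ge\; 1 - \delta .
\]
% The independence phenomenon: taking $X = [0,1]$ and $\mathcal{F}$ to
% be the family of finite subsets of $[0,1]$, the existence of such a
% learner can neither be proved nor refuted from the standard (ZFC)
% axioms, because it is tied to the continuum hypothesis.
```

The choice of $X = [0,1]$ with finite subsets is what links learnability to the cardinality of the continuum, which is why the question escapes the standard axioms.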

As a corollary, we show that for some basic notion of statistical learning there can be no combinatorial dimension that characterizes learnability (in a way similar to the fundamental characterization of PAC learnability by VC-dimension).

The talk is based on joint work with Pavel Hrubeš, Shay Moran, Amir Shpilka, and Amir Yehudayoff.

MC 5501
