
Please note: The University of Waterloo is closed for all events until further notice.

Events - March 2020

Thursday, March 12, 2020 — 4:00 PM EDT

Bayesian Additive Regression Trees for Statistical Learning

Regression trees are flexible non-parametric models that are well suited to many modern statistical learning problems. Many such tree models have been proposed, from the simple single-tree model (e.g. Classification and Regression Trees, CART) to more complex tree ensembles (e.g. Random Forests). Their non-parametric formulation allows one to model datasets exhibiting complex non-linear relationships between predictors and the response. A recent innovation in the statistical literature is the development of Bayesian analogues to these classical regression tree models. The benefit of the Bayesian approach is the ability to quantify uncertainties within a holistic Bayesian framework. We introduce the most popular variant, the Bayesian Additive Regression Trees (BART) model, and describe recent innovations to this framework. We conclude with some of the exciting research directions currently being explored.
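The "sum of trees" structure underlying BART can be illustrated with a minimal sketch: an additive ensemble of regression stumps fit greedily to residuals. Note this is only the additive-trees structure the abstract refers to, not BART itself, which places priors on the trees and samples them by MCMC; all names and parameter choices below are illustrative assumptions.

```python
import numpy as np

def fit_stump(x, y):
    """Find the single split on x minimizing the squared error of
    a two-leaf (stump) regression tree."""
    best = None
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lo, hi = best
    return t, lo, hi

def predict_stump(stump, x):
    t, lo, hi = stump
    return np.where(x <= t, lo, hi)

# Toy data with a non-linear mean function.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + rng.normal(0, 0.1, 200)

# Additive ensemble: each stump is fit to the residual of the current
# fit and added with a small shrinkage factor, so the final predictor
# is a sum of many weak trees.
pred = np.zeros_like(y)
ensemble = []
for _ in range(200):
    stump = fit_stump(x, y - pred)
    pred += 0.1 * predict_stump(stump, x)
    ensemble.append(stump)

print(np.mean((y - pred) ** 2))
```

The ensemble's squared error approaches the noise level, which a single stump cannot achieve on this non-linear target; BART replaces the greedy residual fitting above with posterior sampling, yielding uncertainty intervals alongside the point prediction.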

Thursday, March 12, 2020 — 2:45 PM EDT

Please note: This seminar has been cancelled.

Friday, March 6, 2020 — 10:30 AM EST

Please note: This seminar has been cancelled.

Thursday, March 5, 2020 — 4:00 PM EST

Concentration of Maxima: Fundamental Limits of Exact Support Recovery in High Dimensions

We study the estimation of the support (set of non-zero components) of a sparse high-dimensional signal observed with additive and dependent noise. With the usual parameterization of the size of the support set and the signal magnitude, we characterize a phase-transition phenomenon akin to Ingster's signal detection boundary. We show that when the signal is above the so-called strong classification boundary, thresholding estimators achieve asymptotically perfect support recovery. This is so under arbitrary error dependence assumptions, provided that the marginal error distribution has rapidly varying tails. Conversely, under mild dependence conditions on the noise, we show that no thresholding estimator can achieve perfect support recovery if the signal is below the boundary. For log-concave error densities, the thresholding estimators are shown to be optimal, and hence the strong classification boundary is universal in this setting.

The proofs exploit a concentration of maxima phenomenon, known as relative stability. We obtain a complete characterization of the relative stability phenomenon for dependent Gaussian noise via the Slepian and Sudakov-Fernique bounds and some Ramsey theory.
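The thresholding estimators in the abstract can be sketched numerically: declare index i in the support whenever the observation exceeds a threshold on the scale of the maximum of p standard Gaussians, which concentrates around sqrt(2 log p). The signal magnitude and threshold below are illustrative assumptions placing the signal well above that scale, not the talk's precise strong classification boundary.

```python
import numpy as np

rng = np.random.default_rng(1)
p, k = 10_000, 10                        # dimension and sparse support size
support = rng.choice(p, size=k, replace=False)

# Place the signal well above sqrt(2 log p), the concentration scale of
# the maximum of p iid standard Gaussians (illustrative magnitude; the
# actual phase-transition boundary is a refinement of this scale).
mu = np.zeros(p)
mu[support] = 3.0 * np.sqrt(2 * np.log(p))

# Observation: sparse signal plus (here, independent) Gaussian noise.
x = mu + rng.standard_normal(p)

# Thresholding estimator: keep index i when x_i exceeds the threshold t.
t = 1.5 * np.sqrt(2 * np.log(p))
estimate = np.flatnonzero(x > t)

print(set(estimate) == set(support))     # exact support recovery
```

With the signal this far above the noise-maximum scale, the noise coordinates all fall below t while the signal coordinates all exceed it, so the recovery is exact; near the boundary this separation fails, which is the phase transition the talk characterizes.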
