Friday, November 29, 2019 — 10:00 AM EST

Jaejun Lee, Master’s candidate
David R. Cheriton School of Computer Science

Used for simple voice commands and wake-word detection, keyword spotting (KWS) is the task of detecting predetermined keywords in a stream of utterances. A common implementation of KWS transmits audio samples over the network and detects the target keywords in the cloud with neural networks, because on-device application development faces compatibility issues across edge devices and offers limited support for deep learning. Unfortunately, such an architecture can lead to an unpleasant user experience because network latency is not deterministic. Furthermore, the client-server architecture raises privacy concerns because users lose control over their audio data once it leaves the edge device. A minimal sketch of the on-device alternative appears below.
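
The following sketch illustrates on-device keyword spotting as a sliding-window detector that scores one-second windows of an utterance stream with a small model. The feature front end, the linear scorer, and the threshold are hypothetical placeholders standing in for a trained neural network; none of this is the speaker's implementation.

```python
import numpy as np

SAMPLE_RATE = 16000        # assumed microphone sample rate (Hz)
WINDOW = SAMPLE_RATE       # score one-second windows of audio
THRESHOLD = 0.8            # posterior above which a detection is reported

def log_spectral_features(window: np.ndarray, n_bins: int = 40) -> np.ndarray:
    """Crude log-spectral features standing in for a real log-Mel front end."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    bands = np.array_split(spectrum, n_bins)
    return np.log1p(np.array([band.mean() for band in bands]))

def keyword_posterior(features: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """Tiny linear model + sigmoid standing in for the on-device neural network."""
    return 1.0 / (1.0 + np.exp(-(features @ weights + bias)))

def spot_keywords(audio_stream: np.ndarray, weights: np.ndarray, bias: float):
    """Slide over the utterance stream and flag windows that contain the keyword."""
    detections = []
    for start in range(0, len(audio_stream) - WINDOW + 1, WINDOW // 2):
        window = audio_stream[start:start + WINDOW]
        p = keyword_posterior(log_spectral_features(window), weights, bias)
        if p >= THRESHOLD:
            detections.append((start / SAMPLE_RATE, p))
    return detections

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    audio = rng.standard_normal(5 * SAMPLE_RATE)           # stand-in audio
    weights, bias = rng.standard_normal(40) * 0.1, -1.0    # stand-in model
    print(spot_keywords(audio, weights, bias))
```

Because every window is scored locally, no audio ever leaves the device, which is the property motivating the on-device approach discussed in the talk.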

Friday, December 6, 2019 — 2:00 PM EST

Priyank Jaini, PhD candidate
David R. Cheriton School of Computer Science

Multivariate density estimation is a central problem in unsupervised machine learning that has been studied extensively in both statistics and machine learning. Several methods have been proposed for density estimation, including classical techniques such as histograms, kernel density estimation, and mixture models, as well as, more recently, neural density estimation, which leverages advances in deep learning and neural networks to tractably represent a density function. In an age when large amounts of data are generated in almost every field, it is of paramount importance to develop density estimation methods that are cheap in both computation and memory. The main contribution of this thesis is a principled study of parametric density estimation methods using mixture models and triangular maps for neural density estimation; a small worked sketch of the triangular-map idea follows.
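
The sketch below illustrates the change-of-variables principle behind triangular maps: an increasing triangular transformation pushes the data onto a simple base density, and its Jacobian determinant is just the product of the diagonal derivatives. The fixed lower-triangular affine map and standard-normal base used here are illustrative stand-ins, not the learned neural triangular maps studied in the thesis.

```python
import numpy as np

def base_log_density(z: np.ndarray) -> np.ndarray:
    """Log density of a standard normal base distribution, summed over dimensions."""
    return -0.5 * np.sum(z ** 2 + np.log(2 * np.pi), axis=-1)

# A lower-triangular affine map T(x) = L x + b with positive diagonal, so each
# output T_k depends only on x_1, ..., x_k and is increasing in x_k.
L = np.array([[1.5, 0.0],
              [0.7, 2.0]])
b = np.array([0.1, -0.3])

def log_density(x: np.ndarray) -> np.ndarray:
    """Change of variables: log p(x) = log p_base(T(x)) + log |det dT/dx|."""
    z = x @ L.T + b
    # The Jacobian of a triangular map is triangular, so its log-determinant is
    # the sum of the log diagonal entries.
    log_det = np.sum(np.log(np.diag(L)))
    return base_log_density(z) + log_det

if __name__ == "__main__":
    samples = np.array([[0.0, 0.0], [1.0, -1.0]])
    print(log_density(samples))   # tractable log densities for each sample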
