Events

September 2018

Tuesday, September 4, 2018 — 12:30 PM EDT

Ricardo Salmon, PhD candidate
David R. Cheriton School of Computer Science

Stochastic satisfiability (SSAT), quantified Boolean satisfiability (QBF), and decision-theoretic planning in infinite-horizon partially observable Markov decision processes (POMDPs) are all PSPACE-complete problems. Because they are complete for the same complexity class, polynomial-time reductions between them must exist; I show how to carry out these conversions explicitly in polynomial time and space.
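
One direction of the QBF/SSAT connection is easy to illustrate: replacing every universal quantifier with a randomized (fair-coin) quantifier turns a true QBF into an SSAT instance whose optimal satisfaction probability is exactly 1, and conversely. The Python sketch below is illustrative only, not the speaker's construction; the function name, the 'E'/'A'/'R' prefix encoding, and the example formula are choices made for this example.

```python
# Illustrative sketch: evaluate a small prenex Boolean formula by recursing
# over its quantifier prefix. 'E' = existential (maximize), 'A' = universal
# (minimize), 'R' = randomized fair coin (average). A QBF is true exactly when
# the SSAT instance obtained by replacing every 'A' with 'R' has value 1.

def value(prefix, formula, assignment=()):
    """Optimal satisfaction value in [0, 1] of the quantified formula."""
    if not prefix:                        # all variables fixed: evaluate the matrix
        return 1.0 if formula(assignment) else 0.0
    quantifier, rest = prefix[0], prefix[1:]
    v0 = value(rest, formula, assignment + (False,))
    v1 = value(rest, formula, assignment + (True,))
    if quantifier == 'E':                 # existential player picks the better branch
        return max(v0, v1)
    if quantifier == 'A':                 # universal player picks the worse branch
        return min(v0, v1)
    return 0.5 * (v0 + v1)                # randomized variable: average the branches

# Example: the QBF "exists x, forall y, (x and y)" is false, and its SSAT
# counterpart "exists x, random y, (x and y)" has value 0.5 < 1, as expected.
phi = lambda a: a[0] and a[1]
print(value('EA', phi))   # 0.0
print(value('ER', phi))   # 0.5
```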

Thursday, September 13, 2018 — 1:30 PM EDT

Irish Medina, Master’s candidate
David R. Cheriton School of Computer Science

Smart water meters have been installed across Abbotsford, British Columbia, Canada, to measure the water consumption of households in the area. Using this water consumption data, we develop machine learning and deep learning models to predict daily water consumption for existing multi-family residences. We also present a new methodology for predicting the water consumption of new housing developments. 
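
As a rough sketch of the kind of supervised model described above, the code below fits a gradient-boosted regressor to synthetic daily-consumption data. The feature set (day of week, temperature, household size) and the data-generating process are assumptions made for illustration; they are not the Abbotsford smart-meter dataset or the thesis models.

```python
# Hypothetical example: predict daily household water use (litres) from a few
# simple features, using synthetic data in place of the real meter readings.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_days = 1000
day_of_week = rng.integers(0, 7, n_days)          # 0 = Monday ... 6 = Sunday
temperature = rng.normal(15, 8, n_days)           # daily mean temperature (deg C)
occupants = rng.integers(1, 6, n_days)            # household size

# Synthetic target: base use per occupant, plus warm-weather and weekend effects.
litres = (150 * occupants
          + 5 * np.clip(temperature - 10, 0, None)
          + 40 * (day_of_week >= 5)
          + rng.normal(0, 30, n_days))

X = np.column_stack([day_of_week, temperature, occupants])
X_train, X_test, y_train, y_test = train_test_split(X, litres, random_state=0)

model = GradientBoostingRegressor().fit(X_train, y_train)
print("MAE (litres/day):", mean_absolute_error(y_test, model.predict(X_test)))
```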

Tuesday, September 25, 2018 — 1:00 PM EDT

Daniel Recoskie, PhD candidate
David R. Cheriton School of Computer Science

The wavelet transform is a well-studied and understood analysis technique used in signal processing. In wavelet analysis, signals are represented by a sum of self-similar wavelet and scaling functions. Typically, the wavelet transform makes use of a fixed set of wavelet functions that are analytically derived. We propose a method for learning wavelet functions directly from data. We impose an orthogonality constraint on the functions so that the learned wavelets can be used to perform both analysis and synthesis. We accomplish this by using gradient descent and leveraging existing automatic differentiation frameworks. Our learned wavelets are able to capture the structure of the data by exploiting sparsity. We show that the learned wavelets have similar structure to traditional wavelets.
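
The sketch below illustrates the general recipe in the abstract under simplifying assumptions: a single learnable filter, a one-level transform built from its even circular shifts, and a soft orthogonality penalty rather than an exact constraint. It is not the authors' implementation. The filter is trained by gradient descent, with PyTorch's automatic differentiation supplying the derivatives, to reconstruct signals while keeping the detail coefficients sparse and the transform matrix close to orthogonal.

```python
# Hypothetical sketch: learn a wavelet (low-pass) filter h by gradient descent.
import torch

N, L = 64, 8                               # signal length, filter length
torch.manual_seed(0)
h = torch.randn(L, requires_grad=True)     # learnable scaling (low-pass) filter
signs = torch.tensor([(-1.0) ** n for n in range(L)])

def transform_matrix(h):
    """One-level transform: even circular shifts of h (low-pass rows) and of
    the quadrature-mirror filter g (high-pass rows)."""
    g = signs * torch.flip(h, [0])         # g[n] = (-1)^n h[L-1-n]
    rows = []
    for f in (h, g):
        base = torch.cat([f, torch.zeros(N - L)])
        rows += [torch.roll(base, 2 * k) for k in range(N // 2)]
    return torch.stack(rows)               # shape (N, N)

# Synthetic training signals: piecewise-smooth, hence sparse in a wavelet basis.
t = torch.linspace(0, 1, N)
signals = torch.stack([torch.sin(6.28 * f * t) * (t > 0.3).float() for f in (2.0, 3.0, 5.0)])

opt = torch.optim.Adam([h], lr=1e-2)
for step in range(2000):
    W = transform_matrix(h)
    coeffs = signals @ W.T                                 # analysis
    recon = coeffs @ W                                     # synthesis (inverse if W is orthogonal)
    loss = ((recon - signals) ** 2).mean()                 # reconstruction error
    loss = loss + 1e-3 * coeffs[:, N // 2:].abs().mean()   # sparsity of detail coefficients
    loss = loss + ((W @ W.T - torch.eye(N)) ** 2).mean()   # orthogonality penalty
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final loss:", round(loss.item(), 4))
```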
