Events

Please note: This master’s research paper presentation will be given online.

Shahzaib Ali, Master’s candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Tim Brecht

Wednesday, June 15, 2022 2:00 pm - 3:30 pm EDT (GMT -04:00)

DLS: Tony Chan — A Personal and Historical View of Computational Mathematics

Please note: This DLS presentation will be given in person in DC 1302 and livestreamed over MS Teams.

Tony Chan
President, King Abdullah University of Science and Technology (KAUST)
Professor, Applied Mathematics and Computational Science

In its modern incarnation, computational mathematics is a discipline that blossomed only after WWII. But even in its relatively brief history, there have been some major shifts in its methodology, emphasis, and applications.

Please note: This PhD defence will take place online.

Venkata Abhinav Bommireddi, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Eric Blais

Convexity plays a prominent role in both mathematics and computer science. It is defined for sets and functions, and many problems related to them can be solved efficiently given the guarantee that the set/function is convex. In this thesis, we focus on three problems related to convexity where we don’t have that guarantee.
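As a rough illustration of the convexity guarantee mentioned above (and not part of the thesis itself), the following sketch tests a one-dimensional function for violations of the midpoint form of Jensen's inequality; the test functions and the sampling range are placeholders chosen for the example.

import math
import random

def violates_convexity(f, lo=-10.0, hi=10.0, trials=10_000, tol=1e-9):
    """Sample pairs (x, y) and test the midpoint convexity inequality
    f((x + y) / 2) <= (f(x) + f(y)) / 2. A single violation certifies that
    f is not convex; no violation is only weak evidence of convexity."""
    for _ in range(trials):
        x, y = random.uniform(lo, hi), random.uniform(lo, hi)
        if f((x + y) / 2) > (f(x) + f(y)) / 2 + tol:
            return True  # found a witness of non-convexity
    return False

# Example: x**2 is convex, sin(x) is not.
print(violates_convexity(lambda x: x * x))        # expected: False
print(violates_convexity(lambda x: math.sin(x)))  # expected: True (with high probability)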

Wednesday, June 22, 2022 12:00 pm - 1:00 pm EDT (GMT -04:00)

PhD Seminar • Data Systems | NLP • Backward-compatibility for Neural NLP Models

Please note: This PhD seminar will be given online.

Yuqing Xie, PhD candidate
David R. Cheriton School of Computer Science

Supervisors: Professors Ming Li, Jimmy Lin

I would like to share the work I did during my internship with AWS AI on backward-compatible NLP models. The behavior of deep neural networks can be inconsistent across versions. Regressions introduced by a model update are a common cause of concern and often outweigh the gains in accuracy or efficiency.
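As background on how such regressions are often quantified (this is a generic illustration, not the method presented in the seminar), one common metric is the negative flip rate: the fraction of examples the old model classified correctly but the updated model gets wrong. The sketch below assumes plain Python lists of gold labels and model predictions.

def negative_flip_rate(labels, old_preds, new_preds):
    """Fraction of examples the old model got right but the new model gets wrong.
    A higher value means more user-visible regressions after the model update."""
    assert len(labels) == len(old_preds) == len(new_preds)
    flips = sum(
        1
        for y, old, new in zip(labels, old_preds, new_preds)
        if old == y and new != y
    )
    return flips / len(labels)

# Toy example: the update fixes one old error but introduces one regression.
labels    = ["pos", "neg", "pos", "neg"]
old_preds = ["pos", "neg", "neg", "neg"]
new_preds = ["pos", "neg", "pos", "pos"]
print(negative_flip_rate(labels, old_preds, new_preds))  # 0.25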

Please note: This master’s thesis presentation will be given online.

Christian Covington, Master’s candidate
David R. Cheriton School of Computer Science

Supervisors: Professors Xi He, Gautam Kamath

Please note: This PhD seminar will be given online.

Dihong Jiang, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Yaoliang Yu

Out-of-distribution (OOD) data come from a distribution that differs from the training data. Detecting OOD data contributes to the secure deployment of machine learning models. Deep generative models are currently widely used as an unsupervised approach to OOD detection.
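A deep generative model used as an unsupervised OOD detector typically scores an input by its log-likelihood (or a related statistic) under the model and flags low-scoring inputs. The sketch below illustrates only that thresholding step, with a stand-in scorer and a quantile-based threshold; both are placeholder assumptions and not tied to the work presented in the seminar.

import numpy as np

def fit_threshold(in_dist_scores, quantile=0.05):
    """Choose a score threshold so that roughly `quantile` of in-distribution
    data would be (incorrectly) flagged as OOD."""
    return np.quantile(in_dist_scores, quantile)

def is_ood(log_likelihood, x, threshold):
    """Flag x as OOD if the generative model assigns it a log-likelihood
    below the threshold estimated on in-distribution data."""
    return log_likelihood(x) < threshold

# Toy example with a stand-in scorer: a unit Gaussian log-density.
log_likelihood = lambda x: -0.5 * (x ** 2) - 0.5 * np.log(2 * np.pi)
train_scores = [log_likelihood(x) for x in np.random.randn(10_000)]
threshold = fit_threshold(train_scores)
print(is_ood(log_likelihood, 0.1, threshold))  # in-distribution: expected False
print(is_ood(log_likelihood, 8.0, threshold))  # far from training data: expected True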