Welcome to the Department of Statistics and Actuarial Science

The Department of Statistics and Actuarial Science is among the top academic units for statistical and actuarial science in the world and is home to more than 40 research-active full-time faculty members working in diverse and exciting areas. The Department is also home to over 900 undergraduate students and about 150 graduate students in programs including Actuarial Science, Biostatistics, Quantitative Finance, Statistics, and Statistics-Computing.

We are located on the University of Waterloo's main campus, at the heart of Canada's Technology Triangle, about 100 kilometers west of Toronto.

  1. May 31, 2019: Master of Actuarial Science (MActSc) 10th Anniversary

    In 2019, the Master of Actuarial Science (MActSc) professional degree program will be celebrating 10 wonderful years at the University of Waterloo. 

    MActSc is an internationally renowned professional program in actuarial science and risk management, housed within the Department of Statistics and Actuarial Science. This fast-track program is offered only to the best and brightest students from around the world; once accepted, students receive one-on-one training from prominent faculty in the field of actuarial science. After 10 rigorous and demanding years of staying on the cutting edge of the industry and training its most elite practitioners, the MActSc program will celebrate by hosting a banquet dinner on May 31, 2019.

    This event will be a great opportunity for past and current students, faculty, and industry supporters to celebrate their hard work over the past decade.

  2. Nov. 23, 2018: Inspiring Student - Tianyi Liu

    MQF is such a great program. It’s a good combination of studying, researching, and finding a job – doing an internship.

    Tianyi Liu, Master of Quantitative Finance student

    A graduate of Waterloo’s undergraduate mathematical finance program, Tianyi was a natural fit for the Master of Quantitative Finance program. He knew many of his professors, and appreciated the computing facilities and the support of his department.

  3. Nov. 19, 2018: The Vector Institute recognizes MMath Data Science for Artificial Intelligence Scholarship
    Source: Vector Institute for Artificial Intelligence

    There are currently eight programs recognized by the Vector Institute for Artificial Intelligence for ensuring that master's graduates are equipped with the knowledge, skills, and competencies sought by industry. The Department of Statistics and Actuarial Science's Master of Mathematics (MMath) Data Science Specialization is one of these recognized programs.

    The Vector Institute (Vector) has been tasked with supporting Ontario's growing AI ecosystem by helping Ontario's AI companies and labs source talent. This will be achieved through three primary activities: talent placement via internships; networking and AI community building; and applied AI education.

    A primary goal associated with applied AI education is to increase enrolment and the number of graduates from AI-related master’s programs. To support universities in achieving this goal, Vector has introduced the Vector Scholarships in Artificial Intelligence (VSAI) to recognize top students enrolled in AI-related master’s programs. 

    The VSAI are expected to enhance recruitment of top tier students, increase the number of applicants to recognized AI programs, increase access to advanced studies in AI, and build community among scholars.

Read all news
  1. Jan. 21, 2019: Department seminar by Fangda Liu, Georgia State University

    Impact of preferences on optimal insurance in the presence of multiple policyholders

    In the optimal insurance literature, one typically studies optimal risk sharing between one insurer (or reinsurer) and one policyholder. However, the insurance business is based on diversification benefits that arise when pooling many insurance policies. In this paper, we first show that results on optimal insurance that are valid in the case of a single policyholder extend to the case of multiple policyholders, provided their insurance claims are independent. However, due to natural catastrophes, increasing life expectancy, and terrorism events, insurance claims show a tendency to be correlated. Interestingly, in the case of interdependent insurance policies, it may become optimal for the insurer to refuse to sell insurance to some prospects, based on their attitude towards risk or their risk exposure characteristics. This finding calls for government policies to ensure that insurance stays available and affordable to everyone.
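    The diversification point in this abstract can be illustrated with a short numerical sketch (my own illustration, not from the talk): for a pool of n identical claims with common standard deviation sigma and pairwise correlation rho, the variance of the total is n*sigma^2 + n*(n-1)*rho*sigma^2, so per-policy risk vanishes as n grows only when rho = 0.

    ```python
    # Illustrative sketch (not from the talk): how correlation between claims
    # erodes the diversification benefit of pooling n identical policies.
    # Var(S_n) = n*sigma^2 + n*(n-1)*rho*sigma^2 for pairwise correlation rho.

    def pooled_std_per_policy(n, sigma, rho):
        """Standard deviation of the average claim in a pool of n policies."""
        var_sum = n * sigma**2 + n * (n - 1) * rho * sigma**2
        return (var_sum ** 0.5) / n

    sigma = 10.0
    for rho in (0.0, 0.2):
        risks = [pooled_std_per_policy(n, sigma, rho) for n in (1, 10, 100, 1000)]
        print(rho, [round(r, 2) for r in risks])
    ```

    With rho = 0 the per-policy risk falls like sigma/sqrt(n), but with rho = 0.2 it plateaus near sigma*sqrt(rho) no matter how many policies are pooled, which is the correlation effect the abstract highlights.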

  2. Jan. 22, 2019: Department seminar by Alex Shestopaloff, The Alan Turing Institute

    Strategies for scaling iterated conditional Sequential Monte Carlo methods for high dimensional state space models

    The iterated Conditional Sequential Monte Carlo (cSMC) method is a particle MCMC method commonly used for state inference in non-linear, non-Gaussian state space models. Standard implementations of iterated cSMC provide an efficient way to sample state sequences in low-dimensional state space models. However, efficiently scaling iterated cSMC methods to perform well in models with a high-dimensional state remains a challenge. One reason for this is the use of a global proposal, without reference to the current state sequence in the MCMC run. In high dimensions, such a proposal will typically not be well-matched to the posterior and will impede efficient sampling. I will describe a technique based on the embedded HMM (Hidden Markov Model) framework to construct efficient proposals in high dimensions that are local relative to the current state sequence. A second obstacle to scalability of iterated cSMC is not using the entire observed sequence to construct the proposal. Typical implementations of iterated cSMC use a proposal at time t that relies only on data up to time t. In high dimensions and in the presence of informative data, such proposals become inefficient and can considerably slow down sampling. I will introduce a principled approach to incorporating future observations in the cSMC proposal at time t. Through several examples, I will demonstrate that both strategies improve the performance of iterated cSMC for sequence sampling in high-dimensional state space models.
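    As a rough illustration of the baseline method the talk builds on, the following sketch (my own simplified code, assuming a 1-D Gaussian autoregressive state model with a bootstrap proposal; not the speaker's implementation) runs one iterated cSMC sweep: particle 0 is pinned to the reference trajectory, the remaining particles are resampled and propagated, and a new reference path is drawn by tracing back ancestors.

    ```python
    import math
    import random

    def normalize(logw):
        """Convert log-weights to normalized probabilities."""
        m = max(logw)
        w = [math.exp(v - m) for v in logw]
        s = sum(w)
        return [v / s for v in w]

    def sample_index(probs):
        """Draw one index from a discrete distribution."""
        u, c = random.random(), 0.0
        for i, p in enumerate(probs):
            c += p
            if u <= c:
                return i
        return len(probs) - 1

    def csmc_sweep(y, x_ref, n_particles=50, phi=0.9, sig_x=1.0, sig_y=1.0):
        """One conditional SMC sweep for x_t = phi*x_{t-1} + N(0, sig_x^2),
        y_t = x_t + N(0, sig_y^2), conditioned on reference path x_ref."""
        T = len(y)
        X = [[0.0] * n_particles for _ in range(T)]
        A = [[0] * n_particles for _ in range(T)]   # ancestor indices
        logw = [0.0] * n_particles
        for t in range(T):
            if t == 0:
                for i in range(n_particles):
                    X[t][i] = random.gauss(0.0, sig_x)
            else:
                probs = normalize(logw)
                for i in range(1, n_particles):     # resample + propagate
                    A[t][i] = sample_index(probs)
                    X[t][i] = phi * X[t - 1][A[t][i]] + random.gauss(0.0, sig_x)
            X[t][0] = x_ref[t]                      # pin the reference path
            A[t][0] = 0
            logw = [-0.5 * ((y[t] - X[t][i]) / sig_y) ** 2
                    for i in range(n_particles)]
        # trace back a new reference trajectory from the final weights
        k = sample_index(normalize(logw))
        path = [0.0] * T
        for t in range(T - 1, -1, -1):
            path[t] = X[t][k]
            k = A[t][k]
        return path
    ```

    Iterating this sweep, feeding each returned path back in as the next reference, gives the iterated cSMC sampler; the global bootstrap proposal here, which ignores both the current reference path and future observations, is exactly the scalability bottleneck the two strategies in the talk address.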

  3. Jan. 24, 2019: Department seminar by Minsuk Shin, Harvard University

    Some Priors for Nonparametric Shrinkage and Bayesian Sparsity Inference 

    In this talk, I introduce two novel classes of shrinkage priors for different purposes: functional HorseShoe (fHS) prior for nonparametric subspace shrinkage and neuronized priors for general sparsity inference. 

    In function estimation problems, the fHS prior encourages shrinkage towards parametric classes of functions. Unlike other shrinkage priors for parametric models, the fHS shrinkage acts on the shape of the function rather than inducing sparsity on model parameters. I study some desirable theoretical properties, including an optimal posterior concentration property on the function and model selection consistency. I apply the fHS prior to nonparametric additive models for some simulated and real data sets, and the results show that the proposed procedure outperforms the state-of-the-art methods in terms of estimation and model selection.

    For general sparsity inference, I propose the neuronized priors to unify and extend existing shrinkage priors, such as one-group continuous shrinkage priors, continuous spike-and-slab priors, and discrete spike-and-slab priors with point-mass mixtures. The new priors are formulated as the product of a weight variable and a transformed scale variable via an activation function. By altering the activation function, practitioners can easily implement a large class of Bayesian variable selection procedures. Compared with classic spike-and-slab priors, the neuronized priors achieve the same explicit variable selection without employing any latent indicator variable, which results in more efficient MCMC algorithms and more effective posterior modal estimates. I also show that these new formulations can be applied to more general and complex sparsity inference problems, which are computationally challenging, such as structured sparsity and spatially correlated sparsity problems.
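    The "weight variable times activated scale variable" construction described in this abstract can be sketched as follows (illustrative code only; the standard-normal choices, the shift parameter alpha0, and all names are my assumptions, not the talk's notation):

    ```python
    import random

    # Hedged sketch of a neuronized-style prior draw: theta = w * T(delta - alpha0),
    # where w is a weight variable, delta a scale variable, and T an activation.
    # alpha0 is an illustrative shift controlling how often theta is exactly zero.

    def neuronized_draw(activation, alpha0=0.0):
        w = random.gauss(0.0, 1.0)        # weight variable
        delta = random.gauss(0.0, 1.0)    # scale variable before activation
        return w * activation(delta - alpha0)

    relu = lambda z: max(z, 0.0)          # produces exact zeros: spike-and-slab-like
    identity = lambda z: z                # no exact zeros: continuous shrinkage

    random.seed(0)
    draws = [neuronized_draw(relu, alpha0=1.0) for _ in range(10000)]
    print(sum(d == 0.0 for d in draws) / len(draws))  # zero fraction ~ Phi(alpha0)
    ```

    The point of the sketch is the abstract's claim about swapping activation functions: a ReLU-type activation puts positive mass at exactly zero (mimicking a spike-and-slab prior without a latent indicator), while the identity activation yields a purely continuous shrinkage prior.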

All upcoming events

Faculty Joint Publications

Map of Faculty and PhD Student Backgrounds

Meet our people

Jock MacKay

Adjunct Professor

Research interests

My research interests span a variety of areas in the application of statistical methods to the improvement of manufacturing processes, including experimental design and observational methods. My colleague Stefan Steiner and I have developed these methods into a system for reducing variation in process outputs; this work led to our book, Statistical Engineering.