Master's Thesis Defence | Omer Ege Kara, On Enabling Layer-Parallelism for Graph Neural Networks using IMEX Integration

Friday, June 14, 2024 11:00 am - 12:00 pm EDT (GMT -04:00)

MC 5479

Candidate 

Omer Ege Kara | Applied Mathematics, University of Waterloo

Title

On Enabling Layer-Parallelism for Graph Neural Networks using IMEX Integration

Abstract

Graph Neural Networks (GNNs) are a class of neural networks designed to perform machine
learning tasks on graph-structured data. Recently, several works have proposed differential
equation-inspired GNN architectures, which remain robust to train even when equipped with
a relatively large number of layers. Neural networks with more layers are potentially more
expressive; however, training time grows linearly with the number of layers. Parallel-in-layer
training is a method developed to overcome this growth in training time for deeper networks
and was first applied to residual networks. In this thesis, we first give an overview of existing
work on layer-parallel training and on graph neural networks inspired by differential equations.
We then discuss issues that arise when these architectures are trained parallel-in-layer and
propose solutions to address them. Finally, we present and evaluate experimental results on
layer-parallel GNN training using the proposed approach.
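To illustrate the differential-equation view of GNN layers mentioned in the abstract, here is a minimal NumPy sketch (an assumption for illustration, not the thesis's actual architecture or IMEX scheme): node features evolve under dX/dt = -L X + f(X), where L is the graph Laplacian and f is a learned nonlinearity. A forward-Euler step recovers a residual-style layer, while an IMEX step treats the stiff linear diffusion term implicitly and the nonlinearity explicitly.

```python
import numpy as np

def laplacian(A):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def explicit_layer(X, A, W, h=0.1):
    """Forward-Euler (ResNet-style) step: X + h * (-L X + tanh(X W))."""
    L = laplacian(A)
    return X + h * (-L @ X + np.tanh(X @ W))

def imex_layer(X, A, W, h=0.1):
    """IMEX step: implicit in the linear diffusion, explicit in the
    nonlinearity. Solves (I + h L) X_new = X + h * tanh(X W)."""
    n = A.shape[0]
    L = laplacian(A)
    return np.linalg.solve(np.eye(n) + h * L, X + h * np.tanh(X @ W))

# Tiny example: path graph on 3 nodes, 2 features per node
# (weights and features are random placeholders, not trained values).
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 2))
W = rng.standard_normal((2, 2))

X_exp = explicit_layer(X, A, W)
X_imex = imex_layer(X, A, W)
```

Because the implicit solve damps the diffusion term, the IMEX step stays stable for larger step sizes h than forward Euler, which is one motivation for IMEX integration in deep (many-layer) architectures.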