Please note: This PhD defence will take place online.
Aseem Baranwal, PhD candidate
David R. Cheriton School of Computer Science
Supervisors: Professors Kimon Fountoulakis, Aukosh Jagannath
Graph Neural Networks (GNNs) are among the most widely used architectures for classification problems on data where entities carry attribute information alongside relational information. Graph Convolutional Networks (GCNs) and Graph Attention Networks (GATs) are two of the most popular GNN architectures.
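To fix intuition, the sketch below contrasts the two aggregation mechanisms: a graph convolution averages each node's features uniformly over its neighbourhood, while graph attention re-weights neighbours with a score function. This is only an illustrative single layer in NumPy, not the formal definitions used in the thesis; the adjacency matrix A, feature matrix X, and the user-supplied score function are assumptions of the example.

```python
import numpy as np

def graph_convolution(A, X):
    """One graph-convolution step: replace each node's features with the
    degree-normalized average over itself and its neighbours."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)    # neighbourhood sizes
    return (A_hat @ X) / deg                  # uniform mean aggregation

def graph_attention(A, X, score):
    """One graph-attention step: neighbours are averaged with weights given
    by a score function, softmax-normalized over each neighbourhood.
    `score` is a hypothetical callable standing in for a learned attention
    mechanism."""
    n = A.shape[0]
    A_hat = A + np.eye(n)
    logits = np.full((n, n), -np.inf)         # non-edges get zero weight
    for i, j in zip(*np.nonzero(A_hat)):
        logits[i, j] = score(X[i], X[j])      # attention logit on edge (i, j)
    weights = np.exp(logits - logits.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X                        # attention-weighted aggregation
```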
In this thesis, I present a statistical framework for understanding node classification on feature-rich relational data. First, I use the framework to study the generalization error and the effects of two existing architectural components, graph convolutions and graph attention, on the Contextual Stochastic Block Model (CSBM) in the regime where the average degree of a node is at least logarithmic in the number of nodes. Second, I propose a notion of asymptotic local optimality for node classification tasks and design a GNN architecture that is provably optimal in this sense in the sparse regime, i.e., average degree O(1).
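For readers unfamiliar with the CSBM, the following is a minimal two-class sampler under common simplifying assumptions (balanced classes, symmetric Gaussian feature means); the parameter names p, q, mu, and sigma are illustrative and need not match the thesis's exact parameterization. Scaling the edge probabilities like constants over n gives the sparse regime with average degree O(1), while probabilities of order log(n)/n give the at-least-logarithmic-degree regime studied first.

```python
import numpy as np

def sample_csbm(n, p, q, mu, sigma, rng=None):
    """Sample a two-class Contextual Stochastic Block Model (illustrative).

    Labels are +/-1 with equal probability; an edge appears with probability
    p within a class and q across classes; node features are Gaussian with
    mean +mu or -mu according to the label and noise level sigma."""
    rng = np.random.default_rng(rng)
    mu = np.asarray(mu, dtype=float)
    y = rng.choice([-1, 1], size=n)                    # class labels
    same = np.equal.outer(y, y)                        # same-class indicator
    probs = np.where(same, p, q)                       # per-pair edge probabilities
    upper = np.triu(rng.random((n, n)) < probs, k=1)   # sample strict upper triangle
    A = (upper | upper.T).astype(int)                  # symmetric adjacency, no self-loops
    X = y[:, None] * mu + sigma * rng.standard_normal((n, mu.size))
    return A, X, y
```

For example, sample_csbm(1000, 0.02, 0.005, mu=[1.0, 0.0], sigma=1.0) produces a graph whose expected average degree is roughly n(p + q)/2 = 12.5, above log(1000) ≈ 6.9, so it falls in the logarithmic-degree regime.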