PhD Seminar • Machine Learning • Locally Optimal Message-Passing on Feature-Decorated Sparse Graphs

Tuesday, September 17, 2024 11:00 am - 12:00 pm EDT (GMT -04:00)

Please note: This PhD seminar will take place online.

Aseem Baranwal, PhD candidate
David R. Cheriton School of Computer Science

Supervisors: Professors Kimon Fountoulakis and Aukosh Jagannath

We study the node classification problem on feature-decorated graphs in the sparse setting, where the expected degree of a node is O(1) in the number of nodes, and in the fixed-dimensional asymptotic regime, where the dimension of the feature data is held fixed while the number of nodes grows large. Such graphs are typically locally tree-like.
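
Concretely, the regime can be written as follows (the symbols n, d, and G_n are chosen here purely as illustrative notation; the abstract itself names no symbols):

```latex
% Sparse, fixed-dimensional asymptotic regime:
% a graph G_n on n nodes with a feature vector x_v \in \mathbb{R}^d at each node v.
\mathbb{E}[\deg(v)] = O(1)
\quad \text{and} \quad
d \text{ fixed, as } n \to \infty.
```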

We introduce a notion of Bayes optimality for node classification tasks, called asymptotic local Bayes optimality, and compute the optimal classifier under this criterion for a fairly general statistical data model with arbitrary distributions of the node features and edge connectivity. The optimal classifier is implementable using a message-passing graph neural network architecture. We then compute the generalization error of this classifier and theoretically compare its performance against existing learning methods on a well-studied statistical model with naturally identifiable signal-to-noise ratios (SNRs) in the data. We find that the optimal message-passing architecture interpolates between a standard MLP in the regime of low graph signal and a typical graph convolution in the regime of high graph signal. Furthermore, we prove a corresponding non-asymptotic result.
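
For intuition only, the sketch below shows one message-passing round that interpolates between a plain MLP and a neighborhood convolution via a mixing weight alpha. This is not the paper's derived Bayes-optimal classifier; every name here (mlp_logits, message_passing_logits, alpha, the linear score w, b) is an illustrative assumption.

```python
import numpy as np

def mlp_logits(X, w, b):
    # Per-node score computed from each node's own features only
    # (the "standard MLP" end of the interpolation).
    return X @ w + b

def message_passing_logits(X, adj, w, b, alpha):
    # X     : (n, d) array of node features, d fixed.
    # adj   : list of neighbor-index lists (sparse graph, O(1) expected degree).
    # alpha : mixing weight in [0, 1]; alpha = 0 ignores the graph (plain MLP),
    #         larger alpha weights aggregated neighbor messages more heavily,
    #         mimicking the MLP <-> convolution interpolation described above.
    self_term = mlp_logits(X, w, b)          # shape (n,)
    agg = np.zeros_like(self_term)
    for v, nbrs in enumerate(adj):
        if nbrs:                             # average the neighbors' scores
            agg[v] = self_term[nbrs].mean()
    return (1.0 - alpha) * self_term + alpha * agg

# Toy usage: 4 nodes on a path graph, 2-dimensional features.
X = np.array([[0.5, -1.0], [1.2, 0.3], [-0.7, 0.8], [0.1, 0.1]])
adj = [[1], [0, 2], [1, 3], [2]]
w, b = np.array([1.0, -0.5]), 0.0
labels = (message_passing_logits(X, adj, w, b, alpha=0.5) > 0).astype(int)
```

In the paper's setting, the weight given to neighbor information is dictated by the graph's signal-to-noise ratio rather than hand-tuned; treating alpha as a free knob is purely a didactic simplification.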

Based on the paper: Baranwal, A., Fountoulakis, K., & Jagannath, A. (2023). Optimality of message-passing architectures for sparse graphs. Advances in Neural Information Processing Systems (NeurIPS), 36, 40320-40341.

Attend this PhD seminar on Zoom.