Logic and asymptotic combinatorics of Graph Neural Networks

Graph neural networks (GNNs) are the predominant architectures for a wide variety of learning tasks on graphs. We present a new angle on the expressive power of GNNs by studying how the predictions of a GNN-based probabilistic classifier evolve as the classifier is applied to larger and larger graphs drawn from a random graph model. We show that the output converges asymptotically almost surely to a constant function, which upper-bounds what these classifiers can express uniformly.
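One way to read the convergence claim formally (our paraphrase here; the paper gives the precise statement and the class of models covered): for a classifier $\xi$ with outputs in $[0,1]$ and graphs $G_n$ drawn from, say, the Erdős–Rényi model $\mathbb{G}(n, p)$, there is a constant $c \in [0,1]$ such that for every $\varepsilon > 0$,

$$\lim_{n \to \infty} \Pr_{G_n \sim \mathbb{G}(n,p)}\big[\, |\xi(G_n) - c| \le \varepsilon \,\big] = 1.$$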

Our convergence results are framed within a query language with aggregates that subsumes a very wide class of GNNs, including state-of-the-art models, with aggregates including mean and the attention-based mechanism of graph transformers. The results apply to a broad class of random graph models, but in the talk we will focus on the Erdős–Rényi model and the stochastic block model. The query-language-based approach allows our results to be situated within the long line of research on convergence laws for logic.
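To make the phenomenon concrete, here is a small, self-contained simulation (our illustration only, not the construction from the paper): a two-layer message-passing network with fixed random weights, mean aggregation, i.i.d. Gaussian node features, mean pooling, and a sigmoid readout, evaluated on Erdős–Rényi graphs $\mathbb{G}(n, 0.1)$ of increasing size. It assumes numpy and networkx are available; all dimensions and constants are arbitrary choices.

```python
# Illustrative sketch: the graph-level output of a random-weight GNN with
# mean aggregation stabilises as Erdős–Rényi graphs G(n, p) grow.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
D = 8                                        # hidden dimension (arbitrary)
W_self = rng.normal(size=(D, D)) / np.sqrt(D)
W_neigh = rng.normal(size=(D, D)) / np.sqrt(D)
w_out = rng.normal(size=D)

def gnn_output(G, feat_rng, layers=2):
    """Mean-aggregation message passing, then mean pooling and a sigmoid."""
    n = G.number_of_nodes()
    A = nx.to_numpy_array(G)
    deg = A.sum(axis=1, keepdims=True)
    # Row-normalised adjacency: row i averages the states of i's neighbours.
    P = np.divide(A, deg, out=np.zeros_like(A), where=deg > 0)
    H = feat_rng.normal(size=(n, D))         # i.i.d. initial node features
    for _ in range(layers):
        H = np.tanh(H @ W_self + (P @ H) @ W_neigh)
    return 1.0 / (1.0 + np.exp(-(H.mean(axis=0) @ w_out)))  # mean pool + sigmoid

for n in [50, 200, 800, 3200]:
    outs = [gnn_output(nx.erdos_renyi_graph(n, 0.1, seed=s),
                       np.random.default_rng(s)) for s in range(5)]
    print(f"n={n:5d}  mean={np.mean(outs):.4f}  std={np.std(outs):.1e}")
```

With mean aggregation and a dense $\mathbb{G}(n, p)$, node degrees grow linearly in $n$, so each aggregation step averages many i.i.d.-initialised neighbour states and concentrates; the printed standard deviation across sampled graphs should shrink as $n$ grows, consistent with convergence to a constant.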

The talk will include joint work with Sam Adam-Day, Ismail Ceylan, and Ben Finkelshtein (see https://arxiv.org/abs/2403.03880), as well as joint work with Sam Adam-Day and Alberto Larrauri.

This post is licensed under CC BY 4.0 by the author.