CS 201 | Arash Amini, UCLA Statistics

Polynomial Graph Neural Networks: Theoretical Limits and Graph Noise Impact

Abstract:
This talk examines the theoretical foundations of Graph Neural Networks (GNNs), focusing on polynomial GNNs (Poly-GNNs). We start with empirical evidence challenging the need for complex GNN architectures in semi-supervised node classification, showing that simpler methods often perform comparably.
We then analyze Poly-GNNs within a contextual stochastic block model, addressing a key question: Does increasing GNN depth improve class separation in node representations?
Our results show that for large graphs, the rate of class separation remains constant regardless of network depth. We demonstrate how “graph noise” can overpower other signals in deeper networks, negating the benefits of additional feature aggregation. The analysis also reveals differences in noise propagation between even- and odd-layered GNNs, providing insights for network architecture design and highlighting trade-offs between model complexity and performance in graph-based machine learning.
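To make the setting concrete, here is a minimal sketch of a Poly-GNN-style aggregation on a contextual stochastic block model. All parameters (block sizes, edge probabilities, class means) are illustrative assumptions, not from the talk; the "separation" score is a simple stand-in for the class-separation quantity the analysis studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-block contextual SBM (illustrative parameters only)
n, d = 200, 5               # nodes per class, feature dimension
p_in, p_out = 0.10, 0.02    # within-block vs. between-block edge probabilities
labels = np.repeat([0, 1], n)

# Adjacency matrix: independent Bernoulli edges, denser within blocks
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.random((2 * n, 2 * n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                 # symmetrize, no self-loops

# Node features: class-dependent Gaussian means plus noise
mu = np.zeros((2, d))
mu[0, 0], mu[1, 0] = 1.0, -1.0
X = mu[labels] + rng.normal(size=(2 * n, d))

def separation(H, labels):
    """Distance between class means, scaled by average within-feature spread."""
    m0, m1 = H[labels == 0].mean(0), H[labels == 1].mean(0)
    return float(np.linalg.norm(m0 - m1) / (H.std(0).mean() + 1e-12))

# Poly-GNN of depth k: repeatedly aggregate features with the
# row-normalized adjacency (one linear aggregation per layer)
deg = A.sum(1)
deg[deg == 0] = 1.0
An = A / deg[:, None]

H, seps = X, []
for k in range(4):
    seps.append(separation(H, labels))   # separation after k layers
    H = An @ H

print([round(s, 2) for s in seps])
```

Tracking the score across depths shows how aggregation and graph noise interact; the talk's point is that, for large graphs, the *rate* of separation does not improve with depth.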

Bio:
Arash A. Amini is an Associate Professor of Statistics and Data Science at the University of California, Los Angeles. He received his Ph.D. in electrical engineering from the University of California, Berkeley in 2011, and completed a postdoctoral fellowship at the University of Michigan. His research spans high-dimensional statistics, functional and nonparametric estimation, network data analysis, optimization, and graphical models, with recent work shedding light on the performance limits of graph neural networks. At UCLA, he teaches courses ranging from introductory machine learning to advanced theoretical statistics, mentoring the next generation of statisticians and data scientists.

For more details, please visit his UCLA homepage at http://www.stat.ucla.edu/~arashamini/

Date/Time:
Date(s) - Feb 13, 2025
4:00 pm - 5:45 pm

Location:
3400 Boelter Hall
420 Westwood Plaza, Los Angeles, CA 90095