CS 201 | Boris Hanin, Princeton University

Scaling Limits of Neural Networks

Abstract:
Neural networks are often studied analytically through scaling limits: regimes in which structural network parameters, such as depth, width, and the number of training datapoints, are taken to infinity, yielding simplified models of learning. I will survey several such approaches with the goal of illustrating the rich and still not fully understood space of possible behaviors when some or all of the network's structural parameters are large.
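As a small illustration of the width limit mentioned above (my own sketch, not material from the talk): under the standard 1/sqrt(width) output scaling, the output of a randomly initialized one-hidden-layer network at a fixed input becomes approximately Gaussian as the width grows, with mean and variance that stabilize. The function `random_net_output` and all parameter choices here are hypothetical, chosen only for demonstration.

```python
import numpy as np

def random_net_output(width, x, rng):
    # One-hidden-layer network with i.i.d. N(0, 1) weights and a
    # 1/sqrt(width) factor on the output, so the variance stays O(1).
    w1 = rng.standard_normal((width, x.shape[0]))
    w2 = rng.standard_normal(width)
    h = np.tanh(w1 @ x)            # hidden-layer activations
    return w2 @ h / np.sqrt(width)

rng = np.random.default_rng(0)
x = np.ones(3)                     # a fixed test input
for width in (10, 100, 10_000):
    samples = np.array(
        [random_net_output(width, x, rng) for _ in range(2000)]
    )
    # As width grows, the empirical distribution of the output
    # approaches a fixed zero-mean Gaussian (the infinite-width limit).
    print(f"width={width:>6}: mean={samples.mean():+.3f}, "
          f"var={samples.var():.3f}")
```

Running this shows the mean and variance of the output distribution settling down as the width increases; this is the simplest instance of the kind of simplification that scaling limits provide.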

Bio:
Boris Hanin is an Assistant Professor in Princeton ORFE working on mathematical physics and theoretical machine learning. He is the recipient of an NSF CAREER Award (2022) and an Alfred P. Sloan Fellowship in Mathematics (2024).

Date/Time:
Oct 17, 2024
4:00 pm - 5:45 pm

Location:
3400 Boelter Hall
420 Westwood Plaza, Los Angeles, CA 90095