CS 201: Climbing Out The Deep End, YORAM SINGER, Google

ABSTRACT: Deep learning refers to the task of training multilayer artificial neural networks. Despite the practical success of deep learning, existing analyses of the sample complexity, algorithmic stability, and generalization of deep networks are rather shallow. In this talk I will review three interleaved lines of research: stability of stochastic gradient methods for training neural networks; sketching algorithms and neural networks; and the duality between neural networks and reproducing kernel Hilbert spaces. This research provides first steps toward understanding the current successes of deep architectures and serves as a stepping stone toward new non-linear architectures and learning algorithms. I will also present experimental results that provide empirical validation of the formal analyses. Joint work with Amit Daniely, Roy Frostig, Vineet Gupta, Moritz Hardt, Nevena Lazic, Ben Recht, and Kunal Talwar.

BIO: Yoram Singer is a research scientist at Google, where he heads a small research group focused on the foundations of machine learning. Before joining Google, he was a professor of Computer Science at the Hebrew University of Jerusalem.

Hosted by Professor Fei Sha

REFRESHMENTS at 3:45 pm, SPEAKER at 4:15 pm

Date/Time:
Date(s) - May 05, 2016
4:15 pm - 5:45 pm

Location:
3400 Boelter Hall
420 Westwood Plaza, Los Angeles, CA 90095