ABSTRACT: The success of deep learning is due, to a large extent, to the remarkable effectiveness of gradient-based optimization methods applied to large neural networks. [...]
ABSTRACT: Knowledge distillation, introduced in the deep learning context, is a method to transfer knowledge from one architecture to another. In particular, when the architectures [...]
ABSTRACT: Recent investigations into infinitely wide deep neural networks have given rise to intriguing connections with kernel methods. Specifically, it was found that [...]
ABSTRACT: As machine learning is increasingly deployed, there is a need for reliable and robust methods that go beyond simple test accuracy. In this talk, [...]
ABSTRACT: As Machine Learning systems are increasingly becoming part of user-facing applications, their reliability and robustness are key to building and maintaining trust with users. [...]