Speaker: Krishnakumar Balasubramanian
Affiliation: UC Davis
The task of sampling from a given density is a fundamental computational problem with numerous applications in machine learning and statistics. In the last decade, the iteration complexity of sampling from a smooth and (strongly) log-concave density has been well-studied. However, handling the more complex multi-modal and heavy-tailed densities arising in practice is relatively less understood. In this talk, I will discuss some recent progress in this direction. In the first part of the talk, I will discuss a recently proposed framework for establishing the iteration complexity of the widely used Langevin Monte Carlo sampling algorithm when the target density satisfies only the relatively milder Hölder-smoothness assumption. Motivated by the theory of non-convex optimization, our guarantees are for converging to an appropriately defined first-order stationary solution for sampling. I will also discuss several extensions and applications of our result; in particular, it yields a new state-of-the-art guarantee for sampling from distributions that satisfy a Poincaré inequality. In the second part of the talk, I will discuss guarantees for appropriately modified versions of the Langevin Monte Carlo sampling algorithm for sampling from heavy-tailed densities, i.e., densities whose tails decay polynomially. These guarantees are established in the stronger Rényi metric.
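For readers unfamiliar with the algorithm discussed above, the following is a minimal sketch of the standard (unadjusted) Langevin Monte Carlo update for a one-dimensional standard Gaussian target. It illustrates only the basic iteration x ← x − η∇U(x) + √(2η)ξ; the target, step size, and all parameter choices here are illustrative assumptions, not the settings or modified variants analyzed in the talk.

```python
import numpy as np

def grad_U(x):
    # Gradient of the potential U(x) = x^2 / 2, i.e. the negative
    # log-density of a standard Gaussian target (illustrative choice).
    return x

def langevin_monte_carlo(n_steps=50_000, step_size=0.01, seed=0):
    """Run unadjusted Langevin Monte Carlo and return the iterates."""
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        # LMC update: gradient step on U plus injected Gaussian noise.
        noise = rng.standard_normal()
        x = x - step_size * grad_U(x) + np.sqrt(2 * step_size) * noise
        samples[k] = x
    return samples

samples = langevin_monte_carlo()
# For a standard Gaussian target, the empirical mean and variance of the
# iterates should be close to 0 and 1 (up to discretization bias).
print(samples.mean(), samples.var())
```

With a fixed step size, the iterates are approximately (not exactly) distributed according to the target; controlling this discretization bias under weak smoothness or heavy tails is exactly the kind of question the talk addresses.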
Krishna Balasubramanian is an assistant professor in the Department of Statistics, University of California, Davis. His research interests include stochastic optimization and sampling, network analysis, and non-parametric statistics. His research has been supported by a Facebook PhD Fellowship and by CeDAR and NSF grants.
Hosted by Professor Quanquan Gu
Date(s) - Oct 13, 2022
4:15 pm - 5:45 pm
3400 Boelter Hall
420 Westwood Plaza, Los Angeles, California 90095