Talk By: PRAVESH KOTHARI – Average-Case Algorithm Design Using Sum-of-Squares – IAS Princeton

Abstract:

Finding planted “signals” in random “noise” is a theme that captures distributional problems arising in several different areas, such as machine learning (compressed sensing, matrix completion, sparse principal component analysis, regression, recovering planted clusters), average-case complexity (stochastic block model, planted clique, random constraint satisfaction), and cryptography (attacking the security of pseudorandom generators). For some of these problems (e.g., variants of compressed sensing and matrix completion), influential works in the past two decades identified the right convex relaxation and techniques for analyzing it that guarantee nearly optimal recovery guarantees (with respect to the information-theoretic threshold). However, these methods are problem-specific and do not immediately generalize to other problems or variants.

This talk is about a principled approach, via the sum-of-squares method, to designing and analyzing a “right” convex relaxation that appears to offer optimal recovery guarantees for a broad class of average-case problems, including all those listed above.

I’ll explain how the process of coming up with the convex relaxation and its analysis in this paradigm can be black-boxed into showing a “simple” proof of unique identifiability, i.e., a statement that the signal (such as an added clique) is information-theoretically uniquely identified by the observed data. As an example, I’ll focus on a recent work that yielded a quasi-polynomial-time algorithm for separating mixtures of spherical Gaussians at the statistical threshold. No sub-exponential-time algorithm was known previously in this regime.
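
As a rough schematic of the sum-of-squares meta-algorithm the abstract alludes to (the notation $X^*$, $Y$, $\mathcal{A}$, $\tilde{\mathbb{E}}$, $d$, $\varepsilon$ is illustrative and not taken from the talk itself): model the observation $Y$ as generated from a hidden signal $X^*$, and encode recovery as a system of polynomial constraints $\mathcal{A} = \{p_1(X) \ge 0, \ldots, p_m(X) \ge 0\}$ relating a candidate signal $X$ to $Y$.

1. Solve the degree-$d$ sum-of-squares relaxation: find a pseudo-expectation $\tilde{\mathbb{E}}$ over candidate signals $X$ satisfying $\mathcal{A}$ (a semidefinite program of size $n^{O(d)}$).
2. If the identifiability statement “$\mathcal{A}$ implies $\|X - X^*\|^2 \le \varepsilon$” admits a degree-$d$ sum-of-squares proof, then $\tilde{\mathbb{E}}[\|X - X^*\|^2] \le \varepsilon$.
3. Output $\hat{X} = \tilde{\mathbb{E}}[X]$; by step 2 and Cauchy–Schwarz for pseudo-expectations, $\|\hat{X} - X^*\|^2 \le \varepsilon$.

In this framing, algorithm design reduces to exhibiting a low-degree sum-of-squares proof of unique identifiability, which is the “simple proof” mentioned above.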

 

Date/Time:
Date(s) - Jan 10, 2019
4:15 pm - 5:45 pm

Location:
Engineering VI – Room 289
404 Westwood Plaza, Los Angeles, California 90095