Robust Learning of Mixtures of Gaussians

Date: October 13, 2022
Time: 4:00 pm
Room: DBH 4011
Speaker: Pravesh Kothari (CMU)
Abstract:
For a while now, the problem of robustly learning a high-dimensional mixture of Gaussians has had a target on its back. The first works in algorithmic robust statistics gave robust algorithms for learning a single Gaussian in 2016. Since then, there has been steady progress, including clustering spherical mixtures at statistically optimal separation, clustering non-spherical mixtures, and the related problem of list-decoding the mean and covariance in the presence of a majority fraction of outliers.
In this talk, I will describe our recent resolution of this problem. I will focus on new tools developed in a sequence of works that relate robust clustering to phenomena in high-dimensional probability, such as anti-concentration and hypercontractivity; their algorithmic counterparts based on the sum-of-squares method; and new assumption-free guarantees for the tensor decomposition problem.
Based on joint works with Ainesh Bakshi, Ilias Diakonikolas, Misha Ivkov, He Jia, Daniel Kane, Jacob Steinhardt, and David Steurer.
Bio:

Pravesh K. Kothari is an Assistant Professor of Computer Science at CMU. Highlights of his recent work include a sum-of-squares framework for problems in robust statistics such as learning mixtures of arbitrary Gaussians, optimal spectral algorithms for smoothed constraint satisfaction with applications to extremal combinatorics, and sum-of-squares thresholds for average-case optimization problems such as planted clique and random CSPs. His work has been recognized with an NSF CAREER Award, a Sloan Research Fellowship, and a Google Research Scholar Award.
