Ilias Diakonikolas
Recent Advances in Algorithmic High-Dimensional Robust Statistics
Abstract: Fitting a model to a collection of observations is one of
the quintessential problems in machine learning. Since any model is
only approximately valid, an estimator that is useful in practice must
also be robust in the presence of model misspecification. It turns out
that there is a striking tension between robustness and computational
efficiency. Even for the most basic high-dimensional tasks, until
recently the only known estimators were either hard to compute or
could only tolerate a negligible fraction of errors.
In this talk, I will survey recent progress in algorithmic
high-dimensional robust statistics. I will describe the first robust
and efficiently computable estimators for several fundamental learning
tasks that were previously thought to be computationally
intractable. These include robust estimation of mean and covariance in
high dimensions, robust learning of various latent variable models,
and robust stochastic optimization. The new robust estimators are
scalable in practice and have a number of applications in exploratory
data analysis and adversarial machine learning.
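To make the mean-estimation task concrete, the sketch below contrasts the empirical mean with a simple spectral filtering heuristic in the contamination model the talk addresses: an adversary replaces an eps-fraction of Gaussian samples with outliers. This is a minimal illustrative sketch, not the specific estimators from the talk; all parameter choices (threshold, removal fraction) are assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, eps = 50, 5000, 0.1
true_mean = np.zeros(d)

# Inliers drawn from N(0, I); an eps-fraction replaced by a far-away
# outlier cluster (with small noise so the outliers are not identical).
X = rng.standard_normal((n, d))
k = int(eps * n)
X[:k] = 10.0 + 0.5 * rng.standard_normal((k, d))

def filtered_mean(X, max_iter=20, threshold=9.0):
    """Iterative spectral filtering (illustrative sketch): while the top
    eigenvalue of the empirical covariance is much larger than 1, drop
    the points deviating most along the top eigenvector."""
    X = X.copy()
    for _ in range(max_iter):
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        evals, evecs = np.linalg.eigh(cov)
        if evals[-1] <= threshold:  # covariance already near-isotropic
            break
        v = evecs[:, -1]  # direction of largest variance
        scores = ((X - mu) @ v) ** 2
        # Remove the 2% of points with the largest squared deviation
        # along v; outliers dominate this tail.
        keep = np.argsort(scores)[: int(0.98 * len(X))]
        X = X[keep]
    return X.mean(axis=0)

naive = np.linalg.norm(X.mean(axis=0) - true_mean)
robust = np.linalg.norm(filtered_mean(X) - true_mean)
```

Here the naive mean is dragged a distance of roughly eps * 10 * sqrt(d) toward the outlier cluster, while the filtered estimate stays close to the true mean, which is the gap between non-robust and robust estimation that the talk's algorithms close with provable guarantees.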