Speaker: Garvesh Raskutti
Abstract:
In large-scale data settings, randomized 'sketching' has become
an increasingly popular tool. In the numerical linear algebra literature,
randomized sketching based on either random projections or
sub-sampling has been shown to achieve optimal worst-case
error. In particular, the sketched ordinary least-squares (OLS) solution
and the CUR decomposition have been shown to achieve optimal
approximation error bounds in a worst-case setting. However, until
recently there has been limited work considering the performance of
the sketched OLS estimator under a statistical model using statistical
metrics. In this talk I present recent results on the performance of
sketching in the statistical setting, where we assume an underlying
statistical model, and show that many of the existing intuitions and
results differ substantially from those in the worst-case algorithmic
setting.
This is based on joint work with Michael Mahoney.
Reference: http://arxiv.org/abs/1505.06659.
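
For readers unfamiliar with the setup, below is a minimal numerical sketch
(not taken from the talk or paper) of the sketched OLS estimator using a
Gaussian random projection; the data-generating model, dimensions, and
sketch size are assumptions chosen purely for illustration.

import numpy as np

rng = np.random.default_rng(0)
n, p, m = 10_000, 20, 500          # samples, features, sketch size (m << n)

# Assumed statistical model for illustration: y = X beta + noise
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)
y = X @ beta + 0.5 * rng.standard_normal(n)

# Full OLS solution
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Sketched OLS: solve the smaller projected problem min_b ||S X b - S y||_2
S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketching matrix
beta_sk, *_ = np.linalg.lstsq(S @ X, S @ y, rcond=None)

print(np.linalg.norm(beta_sk - beta_ols))      # algorithmic (worst-case style) error
print(np.linalg.norm(beta_sk - beta))          # statistical error w.r.t. the true beta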