Adaptive Sampling for Convex Regression

09 Sep 2021, 2:00p - 3:30p, NSH 3305

Speaker: Ojash Neopane

Abstract: In frequentist statistics, we typically evaluate the performance of a procedure through the lens of minimax optimality. However, for many problems, these notions of minimax optimality turn out to be overly pessimistic. As such, over the past decade there has been rising interest in studying so-called local minimax optimality, which provides a more fine-grained characterization of the fundamental limits of various statistical procedures. In this talk, we will focus on the problem of adaptive sampling for learning a convex function in the ell-infinity norm. We will discuss both local minimax lower bounds and matching upper bounds for this problem, as well as the advantages offered by adaptive sampling over passive designs.

Paper: https://arxiv.org/abs/1808.04523
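
To make the problem setup concrete, here is a minimal sketch (not the procedure from the paper): estimate a convex function on [0, 1] from noisy point evaluations taken at a passive, uniform design, and report the ell-infinity error of a convex least-squares fit. The target function, noise level, sample size, and use of cvxpy are all arbitrary choices for illustration; the talk concerns how adaptive designs can improve on such passive ones.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

def f(x):
    # A convex target function, chosen arbitrarily for this sketch.
    return (x - 0.3) ** 2

n = 50        # sampling budget (illustrative)
sigma = 0.1   # noise standard deviation (illustrative)

# Passive design: noisy evaluations on a uniform grid.
x = np.linspace(0.0, 1.0, n)
y = f(x) + sigma * rng.standard_normal(n)

# Convex least-squares fit: fitted values theta_i at the design points,
# with nonnegative second differences enforcing convexity on the
# (uniform) grid.
theta = cp.Variable(n)
constraints = [theta[2:] - 2 * theta[1:-1] + theta[:-2] >= 0]
problem = cp.Problem(cp.Minimize(cp.sum_squares(theta - y)), constraints)
problem.solve()

# Sup-norm error at the design points, a crude proxy for the
# ell-infinity loss over the whole interval.
sup_err = np.max(np.abs(theta.value - f(x)))
print(f"ell-infinity error of the convex fit at the design points: {sup_err:.3f}")
```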