Speaker: Alden Green
Abstract: Nonparametric goodness-of-fit testing is a classical statistical problem: we observe data y_i = f(x_i) + epsilon_i for x_i in R^d and wish to test the null hypothesis f = 0 against the alternative that f lies in some smooth function class, for instance a Sobolev ball. Often, in order to derive standard minimax rates for this problem, a regularity assumption is made -- either explicitly or implicitly -- on the Sobolev ball, measured by the number of derivatives k relative to the dimension d; in particular, it is often assumed that 2k > d. We will begin by reviewing facts about Sobolev spaces, to establish why the threshold 2k = d is special, why the assumption 2k > d is made, and which properties we retain or lose when we do not make this assumption. We will then review a paper of Ingster's, which shows that a very classical projection-based test statistic can achieve minimax optimal rates as long as 4k > d. (As the preceding discussion will make clear, this includes functions qualitatively quite different from those arising when 2k > d.) Finally, time permitting, I will describe some recent work showing that analogous test statistics, defined over graphs, can also achieve minimax optimal rates without requiring 2k > d.
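To fix ideas, a projection-based test of the kind discussed in the talk can be sketched as follows. This is an illustrative toy version (not the speaker's or Ingster's exact statistic): it assumes a regular design on [0, 1], a cosine basis, and known noise level sigma, then sums squared empirical basis coefficients up to a cutoff M and debiases by the null-hypothesis noise contribution.

```python
import numpy as np

def projection_stat(y, x, M=10, sigma=1.0):
    """Toy projection-based goodness-of-fit statistic.

    Estimates the first M cosine-basis coefficients of f from
    y_i = f(x_i) + epsilon_i on a design x in [0, 1], then sums
    their squares, subtracting the expected squared-noise term
    under the null f = 0. Large values suggest f != 0.
    """
    n = len(y)
    stat = 0.0
    for j in range(1, M + 1):
        phi = np.sqrt(2.0) * np.cos(np.pi * j * x)   # orthonormal basis function on [0, 1]
        theta_hat = np.mean(y * phi)                 # empirical coefficient estimate
        # Under the null, E[theta_hat^2] = sigma^2 * mean(phi^2) / n; subtract it.
        stat += theta_hat**2 - sigma**2 * np.mean(phi**2) / n
    return stat

# Usage: under the null the statistic fluctuates near zero; under a
# signal concentrated on the first basis function it is near theta_1^2.
rng = np.random.default_rng(0)
n = 2000
x = np.arange(n) / n
eps = rng.standard_normal(n)
t_null = projection_stat(eps, x)                         # f = 0
f = np.sqrt(2.0) * np.cos(np.pi * x)                     # theta_1 = 1
t_alt = projection_stat(f + eps, x)
```

A test would reject when the statistic exceeds a threshold calibrated to the null distribution; the choice of cutoff M (balancing bias against variance) is where the smoothness k and dimension d enter the rate analysis.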