The power of online thinning in reducing discrepancy

Jan 15, 3pm, Gates 8102

Speaker: Aaditya Ramdas

Abstract: Consider an infinite sequence of independent, uniformly chosen points from [0,1]^d. After looking at each point in the sequence, an overseer is allowed to either keep it or reject it, and this choice may depend on the locations of all previously kept points. However, the overseer must keep at least one of every two consecutive points. We call a sequence generated in this fashion a two-thinning sequence. Here, the purpose of the overseer is to control the discrepancy of the empirical distribution of points, that is, after selecting n points, to reduce the maximal deviation of the number of points inside any axis-parallel hyper-rectangle of volume A from nA. Our main result is an explicit low-complexity two-thinning strategy which guarantees discrepancy of O(\log^{2d+1} n) for all n with high probability [compare with \Theta(\sqrt{n \log\log n}) without thinning]. The case d=1 of this result answers a question of Benjamini.
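
To make the setup concrete, below is a minimal numerical sketch of a two-thinning rule in d=1. It assumes a simplified pairwise model (points arrive in pairs and exactly one point of each pair is kept, which satisfies the "keep at least one of every two consecutive points" constraint) and a greedy, grid-based discrepancy proxy. It only illustrates the thinning setup; it is not the low-complexity strategy analyzed in the paper, and the function and parameter names are invented for this example.

```python
import numpy as np

def discrepancy_proxy(counts, n_kept, grid):
    # Max deviation of #kept points in [0, t] from n_kept * t, over grid thresholds t.
    return np.max(np.abs(counts - n_kept * grid))

def two_thinning_greedy(n_pairs, grid_size=256, seed=None):
    # Greedy two-thinning in d = 1: from each pair of i.i.d. Uniform[0,1] points,
    # keep the one whose addition minimizes the grid-based discrepancy proxy.
    # (Illustrative heuristic only, not the strategy from the paper.)
    rng = np.random.default_rng(seed)
    grid = np.arange(1, grid_size + 1) / grid_size  # thresholds t = k / grid_size
    counts = np.zeros(grid_size)                    # running counts of kept points <= t
    kept = []
    for _ in range(n_pairs):
        x, y = rng.random(2)
        cx = counts + (grid >= x)   # counts if we keep x
        cy = counts + (grid >= y)   # counts if we keep y
        n_new = len(kept) + 1
        if discrepancy_proxy(cx, n_new, grid) <= discrepancy_proxy(cy, n_new, grid):
            counts, kept_point = cx, x
        else:
            counts, kept_point = cy, y
        kept.append(kept_point)
    return np.array(kept), discrepancy_proxy(counts, len(kept), grid)

if __name__ == "__main__":
    points, disc = two_thinning_greedy(10_000, seed=0)
    print(f"kept {len(points)} points, proxy discrepancy ~ {disc:.2f}")
```

Comparing the returned proxy against the same proxy computed on an equal number of unthinned uniform points gives a rough feel for the gap between polylogarithmic and \sqrt{n}-scale discrepancy that the theorem quantifies.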

Paper: https://link.springer.com/article/10.1007/s00440-018-0860-y?wt_mc=Internal.Event.1.SEM.ArticleAuthorOnlineFirst