Speaker: Tudor Manole
Abstract: The MLE is perhaps the most common estimator for fitting finite mixture models, but it presents significant computational challenges due to the non-convexity of the space of mixing measures with a bounded number of components. An appealing alternative is the nonparametric maximum likelihood estimator (NPMLE), which maximizes the likelihood over the convex set of all mixing measures, including those which are not finite. Beyond its computational tractability, the mixture NPMLE is appealing in that it is a completely tuning-parameter-free estimator, unlike other tractable estimators such as the method of moments. In this talk, I will present a few recent results on the mixture NPMLE. I will spend most of my time discussing paper [1], which proves (very surprisingly) that this estimator performs a form of automatic model selection: with high probability, the NPMLE is supported on a finite set whose cardinality grows logarithmically with the sample size. I will then explain how this result can be used to re-derive known convergence rates for the NPMLE in Hellinger distance. If time permits, I will close with a broader discussion of open questions about the minimax rate for estimating a mixture density in Hellinger distance, in particular with reference to paper [2].

[1] https://arxiv.org/pdf/2008.08244.pdf
[2] https://proceedings.mlr.press/v195/jia23a/jia23a.pdf
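
To make the convexity point concrete, here is a minimal sketch (not the method of either paper, and all setup choices are illustrative assumptions): for a Gaussian location mixture with known unit variance, restricting the NPMLE's atoms to a fixed grid turns the problem into maximizing a concave log-likelihood over the mixing weights, which EM-style multiplicative updates monotonically increase. Counting the weights that remain non-negligible illustrates the finite-support phenomenon the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: n points from a two-component Gaussian location mixture.
n = 500
means = rng.choice([-2.0, 2.0], size=n)
x = means + rng.normal(size=n)

# Fix a grid of candidate atoms. Over this grid, the log-likelihood is a
# concave function of the mixing weights, so the problem is convex.
grid = np.linspace(-5.0, 5.0, 201)
w = np.full(grid.size, 1.0 / grid.size)

# Likelihood of each observation under each candidate atom (unit variance assumed).
L = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2) / np.sqrt(2.0 * np.pi)

for _ in range(2000):
    mix = L @ w                      # fitted mixture density at each data point
    w = w * (L.T @ (1.0 / mix)) / n  # EM update for the mixing weights

# The fitted mixing measure concentrates on a small set of atoms: count the
# weights exceeding a (heuristic, illustrative) cutoff.
support_size = int(np.sum(w > 1e-4))
print(support_size, grid.size)
```

In practice the exact NPMLE is usually computed with faster convex solvers (e.g. interior-point methods) rather than plain EM, but the sketch shows the key structural fact: the optimization is over a convex set of mixing measures, and the solution's support is far smaller than the candidate grid.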