Speaker: Kevin Lin
Abstract: For this week's SMLRG, I will be presenting entrywise perturbation bounds for eigenvectors as proven in recent work by Eldridge, Belkin, and Wang. While this work is not my own, I invoke their results quite a bit in my own research and have found them worthwhile to keep in mind. Specifically, I will first briefly review the stochastic block model (SBM), where the goal is to estimate the clustering of nodes in a random graph, and the classical Davis-Kahan bound (following the analysis in Jing Lei and Alessandro Rinaldo's AoS paper) as a baseline to compare against. Then, I will present the intuition/results/rates for how Eldridge et al. deploy their entrywise perturbation bounds in the SBM setting. These rates immediately imply that spectral clustering achieves exact clustering consistency under only a few extra (mild) assumptions compared to previous work. With the remaining time, I will close the talk with open questions/discussion/ramblings related to incoherence, delocalization in random matrix theory, and possible extensions to broader classes of random graphs beyond SBMs. For those interested in using these rates in their own work, please be aware that minor clarifications and typos in Eldridge et al.'s work have been fixed in a paper by Li, Levina, and Zhu, which I will be using in my talk.
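To make the SBM/spectral-clustering setup concrete before the talk, here is a minimal sketch (my own illustration, not code from any of the cited papers) of simulating a two-block SBM and clustering nodes via the leading eigenvectors of the adjacency matrix; the block probabilities, number of communities, and use of k-means on the spectral embedding are illustrative assumptions rather than the exact setting analyzed in the papers.

```python
# Minimal sketch: a two-block SBM and vanilla spectral clustering.
# Parameters (n, k, p_in, p_out) are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

n, k = 200, 2                      # number of nodes, number of communities
labels = rng.integers(0, k, n)     # hidden community assignment
p_in, p_out = 0.30, 0.05           # within-/between-block edge probabilities

# Expected adjacency: p_in if two nodes share a block, p_out otherwise.
P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                        # symmetric adjacency, no self-loops

# Spectral clustering: eigenvectors of A for the k largest |eigenvalues|,
# then k-means on the rows of the resulting n-by-k embedding.
eigvals, eigvecs = np.linalg.eigh(A)
top = np.argsort(np.abs(eigvals))[-k:]
embedding = eigvecs[:, top]
estimated = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedding)

# Entrywise (infinity-norm) eigenvector bounds control the worst-case row
# deviation of this embedding, which is what drives arguments for exact
# recovery of `labels` (up to a permutation of cluster names).
```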
This is joint work with Edward, Jisu, and Larry. In this talk, I will focus more on the idea (i.e., how the two different concepts, clustering and causal inference, can be harmonized) and on how semiparametric and SML theory can help develop more appealing estimators.