Latent Variable Modeling with Diversity-Inducing Mutual Angular Regularization

Feb 15 (Monday) at 1:30 pm, GHC 8102

Speaker: Pengtao Xie

Abstract: One central task in machine learning (ML) is to extract underlying patterns, structure, and knowledge from data. Latent variable models (LVMs) are principled and effective tools for achieving this goal. Due to the dramatic growth in the volume and complexity of big data, several new challenges have emerged that cannot be effectively addressed by existing LVMs: (1) How to capture long-tail patterns that carry crucial information when pattern popularity follows a power-law distribution? (2) How to reduce model complexity and computational cost without compromising the modeling power of LVMs? (3) How to improve the interpretability and reduce the redundancy of discovered patterns? To address these three challenges, we develop a novel regularization technique for LVMs that controls the geometry of the latent space during learning, encouraging the learned latent components to be diverse and thereby achieving long-tail coverage, low redundancy, and better interpretability. In this talk, I will introduce: 1) how the diversity-inducing mutual angular regularizer (MAR) is defined; 2) how to optimize the MAR, which is non-convex and non-smooth; 3) a theoretical analysis of why the MAR is effective; 4) the applications of the MAR in representation learning and distance metric learning.
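
To give a concrete sense of what a mutual-angle-based diversity score can look like, below is a minimal NumPy sketch that measures the diversity of a set of latent component vectors via their pairwise angles, rewarding angles that are both large and uniform. The specific mean-minus-variance form, the function name, and the way the score would be combined with a model's likelihood are illustrative assumptions, not necessarily the exact regularizer presented in the talk.

```python
# Illustrative sketch (assumed formulation, not necessarily the talk's exact MAR):
# diversity of latent components measured by pairwise angles between them.
import numpy as np

def mutual_angle_diversity(A, eps=1e-12):
    """Diversity score for component vectors A (K x D) based on pairwise angles."""
    # Normalize rows so pairwise cosines are simple dot products.
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    U = A / np.maximum(norms, eps)
    cos = np.clip(U @ U.T, -1.0, 1.0)
    # Non-obtuse angles between all distinct pairs of components.
    iu = np.triu_indices(A.shape[0], k=1)
    angles = np.arccos(np.abs(cos[iu]))
    # Reward angles that are large on average and close to uniform.
    return angles.mean() - angles.var()

# Usage: a diversified learning objective might add lambda * diversity
# to the data log-likelihood (equivalently, subtract it from the loss).
A = np.random.randn(5, 20)  # 5 hypothetical latent components in 20 dimensions
print(mutual_angle_diversity(A))
```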