Speaker: Aaditya Ramdas
Abstract: Given K uncertainty sets that are arbitrarily dependent (for example, confidence intervals for an unknown parameter obtained with K different estimators, or prediction sets obtained via conformal prediction based on K different algorithms), we address the question of how to efficiently combine them in a black-box manner to produce a single uncertainty set. We present a simple and broadly applicable majority vote procedure that produces a merged set with (nearly) the same error guarantee as the input sets. We then extend this core idea in a few ways: we show that weighted averaging can be a powerful way to incorporate prior information, and that a simple randomization trick produces strictly smaller merged sets (without altering the coverage guarantee). Finally, we show how, in online settings, the exponentially weighted majority algorithm can be used to learn a good weighting over time.
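
To make the core procedure concrete, the following is a minimal sketch of the unweighted majority vote merge for one-dimensional confidence intervals, written only from the description in the abstract; it is an illustration under stated assumptions, not the speaker's implementation, and the function name `majority_vote_merge` is hypothetical. The merged set keeps every point contained in more than half of the K input intervals; the weighted and randomized variants mentioned above would change only how the vote is tallied or thresholded.

```python
# Sketch of majority-vote merging of K dependent confidence intervals.
# Assumption (hedged): if each interval has miscoverage at most alpha,
# a Markov-type argument on the average miscoverage indicator bounds the
# miscoverage of the majority-vote set by roughly 2*alpha, matching the
# "(nearly) the same error guarantee" claim in the abstract.

from typing import List, Tuple

Interval = Tuple[float, float]  # (lo, hi), with lo <= hi


def majority_vote_merge(intervals: List[Interval],
                        threshold: float = 0.5) -> List[Interval]:
    """Return {x : (1/K) * #{k : x in C_k} > threshold} as a union of intervals."""
    k = len(intervals)
    # Sweep over endpoint events: +1 when an interval opens, -1 when it closes.
    events = []
    for lo, hi in intervals:
        events.append((lo, +1))
        events.append((hi, -1))
    events.sort()

    merged, active, start = [], 0, None
    for x, delta in events:
        active += delta
        inside = active / k > threshold  # does x win the majority vote?
        if inside and start is None:
            start = x                    # merged region opens here
        elif not inside and start is not None:
            merged.append((start, x))    # merged region closes here
            start = None
    return merged


# Example: merging (0, 3), (1, 4), (2, 5) yields (1, 4), the region
# covered by at least two of the three intervals.
print(majority_vote_merge([(0.0, 3.0), (1.0, 4.0), (2.0, 5.0)]))
```

A weighted vote would replace the uniform count `active / k` with a sum of per-interval weights, and the randomization trick described in the talk would replace the fixed threshold of 1/2 with a suitably drawn random threshold; the sweep itself is unchanged.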