Barnum-Simons Chair in Mathematics and Statistics at Stanford University, and Professor of Electrical Engineering (by courtesy)
310 Sutardja Dai Hall (Banatao Auditorium)
Wednesday, January 18, 2023
Conformal inference methods are becoming all the rage in academia and industry alike. In a nutshell, these methods deliver exact prediction intervals for future observations without making any distributional assumption whatsoever, other than that the data are i.i.d. or, more generally, exchangeable. However, even this minimal assumption may fail, since in many settings the distribution of observations can shift drastically—think of finance or economics, where market behavior can change in response to new legislation or major world events, or public health, where changes occur because of new policies. What are we to do then? This talk will introduce two new methods to deal with this situation.
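For readers new to the basic construction, here is a minimal split-conformal sketch under the i.i.d./exchangeability assumption discussed above. The function name and the quantile convention (the ceil((n+1)(1-alpha))-th order statistic of holdout residuals) are illustrative choices, not the talk's specific algorithms:

```python
import numpy as np

def split_conformal_interval(residuals, y_hat_new, alpha=0.1):
    """Split-conformal prediction interval.

    residuals : absolute residuals |y_i - y_hat_i| on a holdout
                (calibration) set, disjoint from the training set.
    y_hat_new : point prediction for the new observation.
    Returns y_hat_new +/- q, where q is the ceil((n+1)(1-alpha))-th
    smallest holdout residual. Under exchangeability this interval
    covers the new response with probability at least 1 - alpha.
    """
    n = len(residuals)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    q = np.sort(residuals)[min(k, n) - 1]  # conservative if k > n
    return y_hat_new - q, y_hat_new + q
```

The coverage guarantee is marginal and distribution-free: it needs no assumption on the regression model, only on the exchangeability of the data.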
The first method allows for regression algorithms that are not symmetric in the training data, with no resulting loss of coverage if the data points are indeed i.i.d./exchangeable. Moreover, if the data points are not i.i.d./exchangeable, placing weights on the training points before running the method makes the final prediction interval robust vis-à-vis changes in the distribution of the data. The second method has the flavor of an online algorithm: it corrects prediction sets constructed by naive methods, producing prediction intervals that are robust to deviations occurring over time. We will illustrate both methods with examples predicting election results and COVID-19 case trajectories.
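As a sketch of the second, online flavor of correction, the recursion below adjusts a working miscoverage level after each observation, in the spirit of adaptive conformal inference; the function name and step size `gamma` are illustrative, and this is not presented as the talk's exact procedure:

```python
def aci_update(alpha_t, target_alpha, miss, gamma=0.01):
    """One step of an online miscoverage correction.

    alpha_t      : current working miscoverage level.
    target_alpha : desired long-run miscoverage (e.g. 0.1).
    miss         : 1 if the latest true value fell outside the
                   current prediction set, 0 if it was covered.
    After a miss, alpha_t decreases, widening the next interval;
    after a cover, it increases slightly, tightening it. Over time
    the empirical miscoverage is driven toward target_alpha even
    when the data distribution drifts.
    """
    return alpha_t + gamma * (target_alpha - miss)
```

At each time step one would rebuild the prediction set at level `alpha_t` (e.g. with a split-conformal construction) and then apply this update, so the procedure tracks the target coverage even under distribution shift.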
This is joint work with Isaac Gibbs, and with Rina Barber, Aaditya Ramdas and Ryan Tibshirani.