**Rina Foygel Barber**, Professor of Statistics at the University of Chicago

310 Sutardja Dai Hall

April 7, 2023

Algorithmic stability is a framework for studying the properties of a model-fitting algorithm, with many downstream implications for generalization, predictive inference, and other important statistical problems. Stability is often defined as the property that predictions on a new test point are not substantially altered by removing a single point at random from the training set. However, this stability property is itself an assumption that may not hold for highly complex predictive algorithms and/or nonsmooth data distributions. This talk will present two complementary views of this problem. In the first part, we show that it is impossible to infer the stability of an algorithm through “black-box testing” when data is limited: that is, when we cannot study the algorithm theoretically but instead try to determine its stability properties from the algorithm's behavior on various data sets. In the second part, we establish that bagging any black-box algorithm automatically ensures that stability holds, with no assumptions on the algorithm or the data. This work is joint with Byol Kim, Jake Soloff, and Rebecca Willett.
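To make the two ingredients of the abstract concrete, here is a minimal sketch of (a) the leave-one-out notion of stability and (b) bagging a black-box learner. All names (`fit`, `bagged_predict`, `leave_one_out_change`) are illustrative, not from the talk, and `fit(X, y)` stands in for an arbitrary black-box training routine that returns a predictor:

```python
import numpy as np

def bagged_predict(fit, X, y, x_test, n_bags=100, rng=None):
    """Bag a black-box learner: refit on bootstrap resamples of the
    training set (X, y) and average the resulting predictions."""
    rng = np.random.default_rng(rng)
    n = len(y)
    preds = []
    for _ in range(n_bags):
        idx = rng.integers(0, n, size=n)  # draw n points with replacement
        model = fit(X[idx], y[idx])       # refit on the bootstrap sample
        preds.append(model(x_test))
    return float(np.mean(preds))

def leave_one_out_change(predict, X, y, x_test, i):
    """Stability in the sense above: how much does the prediction at
    x_test move when training point i is dropped?"""
    full = predict(X, y, x_test)
    mask = np.arange(len(y)) != i
    loo = predict(X[mask], y[mask], x_test)
    return abs(full - loo)
```

A stable algorithm is one for which `leave_one_out_change` is small for a typical dropped point `i`; the talk's second result says that the `bagged_predict` wrapper guarantees this kind of bound regardless of what `fit` does internally.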

#### Speaker Bio

I am a Professor in the Department of Statistics at the University of Chicago. Before starting at U of C, I was an NSF postdoctoral fellow during 2012–13 in the Department of Statistics at Stanford University, supervised by Emmanuel Candès. I received my PhD in Statistics from the University of Chicago in 2012, advised by Mathias Drton and Nati Srebro, and an MS in Mathematics from the University of Chicago in 2009. Prior to graduate school, I was a mathematics teacher at the Park School of Baltimore from 2005 to 2007.