Events
MaD Seminar: Rethinking Early Stopping: Refine, Then Calibrate
Speaker: Francis Bach (École Normale Supérieure)
Location: 60 Fifth Avenue, Room 150
Date: Monday, May 19, 2025
Machine learning classifiers often produce probabilistic predictions that are critical for accurate and interpretable decision-making in various domains. The quality of these predictions is generally evaluated with proper losses like cross-entropy, which decompose into two components: calibration error assesses general under/overconfidence, while refinement error measures the ability to distinguish different classes. In this talk, I will provide theoretical and empirical evidence that these two errors are not minimized simultaneously during training. Selecting the best training epoch based on validation loss thus leads to a compromise point that is suboptimal for both calibration error and, most importantly, refinement error. To address this, we introduce a new metric for early stopping and hyperparameter tuning that makes it possible to minimize refinement error during training. The calibration error is minimized after training, using standard techniques. Our method integrates seamlessly with any architecture and consistently improves performance across diverse classification tasks. Joint work with Eugène Berta, David Holzmüller, and Michael Jordan (https://arxiv.org/abs/2501.19195).
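The "standard techniques" for minimizing calibration error after training include temperature scaling: fitting a single scalar that rescales the model's logits to minimize validation loss. The sketch below is a minimal, hypothetical illustration of that post-hoc step (it is not the paper's specific method); the data, the grid-search fitting routine, and all variable names are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels):
    # Cross-entropy (negative log-likelihood) of the true labels
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def fit_temperature(logits, labels, grid=np.linspace(0.1, 5.0, 50)):
    # Post-hoc calibration: pick the scalar T minimizing validation NLL.
    # Dividing logits by T leaves the predicted ranking (refinement)
    # unchanged and only adjusts confidence (calibration).
    losses = [nll(logits / t, labels) for t in grid]
    return grid[int(np.argmin(losses))]

# Toy validation set: labels drawn from the softmax of "true" logits,
# then the model outputs an overconfident, scaled-up version of them.
rng = np.random.default_rng(0)
n, k = 500, 3
true_logits = rng.normal(size=(n, k))
labels = np.array(
    [rng.choice(k, p=softmax(true_logits[i : i + 1])[0]) for i in range(n)]
)
overconfident = 3.0 * true_logits  # miscalibrated, same class ranking

T = fit_temperature(overconfident, labels)
# Rescaling by the fitted temperature cannot do worse than T = 1,
# since T = 1 is in the search grid.
assert nll(overconfident / T, labels) <= nll(overconfident, labels)
```

Because temperature scaling preserves the argmax (and the full ranking) of the logits, it changes only the calibration component of the loss, which is why it can be applied after training has already minimized refinement error.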