Events
Exact Asymptotics with Approximate Message Passing and a Study of the Type 1-Type 2 Error Trade-off for SLOPE
Speaker: Cynthia Rush
Location: 60 Fifth Avenue, Room 150
Date: Thursday, February 9, 2023
Approximate message passing (AMP) is a class of iterative algorithms that can be used to systematically derive exact expressions for the asymptotic risk and other performance metrics of estimators constructed as solutions to a broad class of convex optimization problems. In this talk, we present a general program for using AMP in this way and we provide a specific example by using this approach to study the asymptotic model selection properties of sorted L1 penalized estimation (SLOPE). Sorted L1 regularization has been incorporated into many methods for solving high-dimensional statistical estimation problems, with SLOPE being its application in the context of linear regression. We will show how this regularization technique improves variable selection relative to the LASSO by characterizing the optimal SLOPE trade-off between the false discovery proportion and true positive proportion or, equivalently, between measures of type I and type II error. Collaborators on this work include Zhiqi Bu, Jason Klusowski, and Weijie Su (https://arxiv.org/abs/1907.07502 and https://arxiv.org/abs/2105.13302) and Oliver Feng, Ramji Venkataramanan, and Richard Samworth (https://arxiv.org/abs/2105.02180).
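For readers unfamiliar with the estimator, a common formulation of SLOPE in the linear model y = Xb + noise is sketched below; the notation follows the standard SLOPE literature and is not taken from the talk itself.

\[
\hat{\beta} \;=\; \arg\min_{b \in \mathbb{R}^p} \; \tfrac{1}{2}\,\lVert y - X b \rVert_2^2 \;+\; \sum_{i=1}^{p} \lambda_i \, |b|_{(i)}, \qquad \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0,
\]

where $|b|_{(1)} \ge |b|_{(2)} \ge \cdots \ge |b|_{(p)}$ denote the entries of $b$ sorted in decreasing order of magnitude. Taking all $\lambda_i$ equal recovers the LASSO penalty, so SLOPE generalizes the LASSO by penalizing larger coefficients more heavily.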