Feature-specific inference for penalized regression models using local false discovery rates
Tuesday, November 12, 2019, 3:30 PM, Davis 201
Refreshments at 3:00 PM, Davis 2nd floor
Regression modeling is a powerful statistical tool with well-studied inferential methods available for common models, including least squares linear regression, logistic regression, and Cox proportional hazards regression. However, these classical methods break down when the number of explanatory variables exceeds the sample size. Penalized regression models provide an attractive approach to analyzing high-dimensional data, but their inferential tools are less well-developed than those of their unpenalized counterparts. Many popular penalization approaches, most notably the LASSO, naturally perform variable selection, prompting the question “how confident can we be in these selections?” as a starting point for inference. In this talk we seek to answer that question, beginning with an introduction to LASSO regression and then using the optimization conditions that characterize the LASSO solution to develop feature-specific local false discovery rate estimates for each explanatory variable under consideration. We demonstrate the validity of this approach and compare it with several other inferential methods currently available for the analysis of high-dimensional data.
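To give a flavor of the idea, the sketch below is a hedged illustration (not the speaker's actual method): it fits a LASSO on simulated data, forms a KKT-style statistic for each feature from its partial residual, and estimates a local false discovery rate as the ratio of a theoretical N(0,1) null density to a kernel estimate of the observed statistics' density. The noise-scale estimate and the empirical-density choice here are simplifying assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import Lasso
from scipy.stats import norm, gaussian_kde

# Illustrative sketch only: simulated data, crude noise estimate.
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize features
beta = np.zeros(p)
beta[:5] = 2.0                             # five truly active features
y = X @ beta + rng.standard_normal(n)
y = y - y.mean()

lam = 0.2
fit = Lasso(alpha=lam, fit_intercept=False).fit(X, y)
resid = y - X @ fit.coef_

# KKT-style statistic: correlation of each feature with its partial
# residual (the residual plus that feature's own fitted contribution).
sigma = resid.std()                        # crude noise-scale estimate
z = np.empty(p)
for j in range(p):
    r_j = resid + X[:, j] * fit.coef_[j]
    z[j] = X[:, j] @ r_j / (sigma * np.sqrt(n))

# Local fdr estimate: theoretical N(0,1) null density divided by an
# empirical (kernel) density estimate of the observed statistics.
f_hat = gaussian_kde(z)
lfdr = np.clip(norm.pdf(z) / f_hat(z), 0.0, 1.0)
```

In this toy simulation the truly active features receive local fdr estimates near zero, while the null features' estimates sit closer to one, which is the feature-specific reading of "how confident can we be in these selections?"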