Case Sensitivity in Regression and Beyond

Abstract Number:

2411 

Submission Type:

Contributed Abstract 

Contributed Abstract Type:

Paper 

Participants:

Haozhen Yu (1), Yoonkyung Lee (1)

Institutions:

(1) The Ohio State University, Columbus, OH

Co-Author:

Yoonkyung Lee  
The Ohio State University

First Author:

Haozhen Yu  
The Ohio State University

Presenting Author:

Haozhen Yu  
The Ohio State University

Abstract Text:

The sensitivity of a model to data perturbations is key to model diagnostics and to understanding model stability and complexity. Case deletion has primarily been considered for sensitivity analysis in linear regression, where the notions of leverage and residual are central to the influence of a case on the model. Instead of case deletion, we examine the change in the model due to an infinitesimal data perturbation, known as local influence, for various machine learning methods. This local influence analysis reveals a notable commonality in the form of case influence across different methods, allowing us to generalize the concepts of leverage and residual far beyond linear regression. At the same time, the results show differences in the mode of case influence depending on the method. Through the lens of local influence, we provide a generalized and unifying perspective on case sensitivity in modeling that covers regularized regression, large margin classification, generalized linear models, and quantile regression.
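
For reference, the sketch below is not part of the submitted abstract and is not the authors' local influence method; it only illustrates the classical case-deletion quantities in linear regression that the abstract takes as its starting point and then generalizes: leverage from the hat matrix, residuals, and Cook's distance. The data are simulated and all variable names are hypothetical.

```python
# Illustrative sketch of classical case-deletion diagnostics in linear
# regression (leverage, residual, Cook's distance); simulated data only.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design with intercept
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Hat matrix H = X (X'X)^{-1} X'; its diagonal gives the leverages h_ii.
XtX_inv = np.linalg.inv(X.T @ X)
H = X @ XtX_inv @ X.T
leverage = np.diag(H)

# Least squares fit, residuals, and the usual error variance estimate.
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)

# Cook's distance expresses the influence of deleting case i through its
# residual and leverage: D_i = e_i^2 * h_ii / (p * sigma^2 * (1 - h_ii)^2).
cooks_d = resid**2 * leverage / (p * sigma2_hat * (1 - leverage) ** 2)

print("largest leverage:", leverage.max())
print("most influential case by Cook's distance:", cooks_d.argmax())
```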

Keywords:

Case Influence|Leverage|Model Diagnostics|Residual|Sensitivity Analysis

Sponsors:

Section on Statistical Learning and Data Science

Tracks:

Machine Learning

Can this be considered for alternate subtype?

Yes

Are you interested in volunteering to serve as a session chair?

No

I have read and understand that JSM participants must abide by the Participant Guidelines.

Yes

I understand that JSM participants must register and pay the appropriate registration fee by June 1, 2024. The registration fee is non-refundable.

I understand