Efficient Estimation for Constrained Optimization Problems

Robert Kass, Co-Author
Carnegie Mellon University
 
Konrad Urban, First Author and Presenting Author
Carnegie Mellon University
 
Wednesday, Aug 6: 10:35 AM - 10:50 AM
2645 
Contributed Papers 
Music City Center 
Many estimands can be defined through constrained optimization problems with a stochastic component, for instance principal component analysis, constrained maximum likelihood estimation, and many penalized estimation problems. To obtain asymptotic theory when an estimand lies on the boundary of the constraint set, researchers have drawn substantial insight from the perturbation analysis of optimization problems, which studies how solutions vary under small changes in auxiliary parameters. Despite this asymptotic theory, the literature on the efficiency of such estimators has focused on finite-dimensional settings and convex objective functions. We help fill this gap by showing how to derive efficient influence functions for general estimands defined through constrained optimization problems with potentially infinite-dimensional nuisance parameters. We again lean on perturbation theory, offer general results for practitioners interested in deriving influence functions for their own estimands, and describe when pathwise differentiability may fail to hold. We provide examples of how this theory can be applied to calculate influence functions for several specific estimands in both semiparametric and nonparametric settings, enabling efficient root-n estimation.
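
As a minimal sketch of the generic setup the abstract refers to (the symbols below are illustrative and not taken from the paper), such an estimand can be written as

\[
\theta(P) \in \operatorname*{arg\,min}_{\theta \in \Theta} \; \mathbb{E}_P\!\left[\ell(\theta; X)\right] \quad \text{subject to} \quad g(\theta) \le 0, \; h(\theta) = 0,
\]

and, when \(\theta(P)\) is pathwise differentiable with efficient influence function \(\varphi_P\), a standard one-step correction of a plug-in estimate,

\[
\hat{\theta} = \theta(\hat{P}_n) + \frac{1}{n}\sum_{i=1}^{n} \varphi_{\hat{P}_n}(X_i),
\]

is one route to root-n efficient estimation.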

Keywords

Efficient influence function

M-estimation

Constrained optimization

Asymptotic theory

Perturbation analysis of optimization problems 

Main Sponsor

Section on Nonparametric Statistics