On improved matrix estimators in high-dimensional data

Arash Foroushani Co-Author
University of Windsor
 
Severien Nkurunziza First Author, Presenting Author
University of Windsor
 
Sunday, Aug 3: 5:20 PM - 5:35 PM
1965 
Contributed Papers 
Music City Center 
In this talk, we introduce a class of improved estimators for the mean parameter matrix of a multivariate normal distribution with an unknown variance-covariance matrix. In particular, some recent results are established in their full generality, and we revise some results that are useful in studying the risk dominance of shrinkage estimators. We generalize the existing methods in three ways. First, we consider a parametric estimation problem that encloses, as a special case, the one concerning the vector parameter. Second, we propose a class of James-Stein matrix estimators and establish a necessary and a sufficient condition for any member of the proposed class to have a finite risk function. Third, we present the conditions under which the proposed class of estimators dominates the maximum likelihood estimator. On top of these contributions, the additional novelty is that we extend the methods suitable for the vector-parameter case, and the derived results hold in the classical setting as well as in the context of high- and ultra-high-dimensional data.
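For context only, the sketch below shows a naive positive-part James-Stein-type shrinkage of a matrix mean, using the Moore-Penrose inverse of the sample covariance so that it remains defined when the dimension exceeds the sample size. The function name james_stein_matrix and the shrinkage constant are illustrative assumptions; this is not the class of estimators, the loss function, or the dominance conditions presented in the talk.

import numpy as np

def james_stein_matrix(X, S, n):
    """Illustrative shrinkage of a (p, q) mean-matrix estimate X toward zero.

    X : (p, q) maximum likelihood estimate of the mean parameter matrix
    S : (p, p) sample variance-covariance matrix (may be singular when p > n)
    n : sample size (or degrees of freedom) scaling the shrinkage constant
    """
    p, q = X.shape
    S_pinv = np.linalg.pinv(S)                 # Moore-Penrose inverse handles singular S (p > n)
    quad = np.trace(X.T @ S_pinv @ X)          # invariant quadratic form of X
    c = max(p * q - 2, 0) / n                  # hypothetical shrinkage constant (assumption)
    shrink = max(1.0 - c / max(quad, 1e-12), 0.0)  # positive-part shrinkage factor
    return shrink * X

# Example usage on simulated data with dimension larger than the sample size:
# rng = np.random.default_rng(0)
# Y = rng.normal(size=(20, 100, 3))            # n = 20 observations of a 100 x 3 matrix
# X_mle = Y.mean(axis=0)
# S = np.cov(Y.reshape(20, -1)[:, :100], rowvar=False)
# X_js = james_stein_matrix(X_mle, S, n=20)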

Keywords

Invariant quadratic loss

James-Stein estimation

Location parameter

Minimax estimation

Moore-Penrose inverse

Risk function 

Main Sponsor

IMS