Heterogeneity-Adaptive Meta-Analysis

Emily Hector (Co-Author)
North Carolina State University

Elizabeth Davis (First Author, Presenting Author)
Sunday, Aug 3: 5:05 PM - 5:20 PM
Session 1533: Contributed Papers
Music City Center
Meta-analytic methods tend to take all-or-nothing approaches to study-level heterogeneity: they either limit the influence of studies suspected to diverge from a shared model or assume all studies are homogeneous. In this paper, we develop a heterogeneity-adaptive meta-analysis for linear models that adapts to the amount of information shared between datasets. The primary mechanism for information sharing is shrinkage of the dataset-specific distributions toward a new "centroid" distribution through a Kullback-Leibler divergence penalty; the Kullback-Leibler divergence is uniquely suited, geometrically, to measuring relative information between datasets. We establish our estimator's desirable inferential properties without assuming homogeneity of the dataset parameters. In particular, we show that our estimator has a provably smaller mean squared error than the dataset-specific maximum likelihood estimators, and we establish asymptotically valid inference procedures. A comprehensive set of simulations illustrates our estimator's versatility, and an analysis of data from the eICU Collaborative Research Database demonstrates its performance in a real-world setting.
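The abstract does not spell out the estimator's form, but the shrinkage mechanism it describes can be sketched in a toy setting. The Python snippet below is a minimal illustration under strong simplifying assumptions, not the authors' method: the function name, the tuning parameter lam, and the alternating update scheme are all hypothetical. It shrinks each dataset's regression coefficients toward a jointly estimated centroid, exploiting the fact that for Gaussian distributions with identity covariance the Kullback-Leibler divergence KL(N(beta_k, I) || N(mu, I)) reduces to 0.5*||beta_k - mu||^2, which yields closed-form updates.

    import numpy as np

    def kl_shrinkage_meta(Xs, ys, lam=1.0, n_iter=100, tol=1e-8):
        # Toy heterogeneity-adaptive estimator (illustrative assumption,
        # not the paper's exact estimator). Objective:
        #   sum_k 0.5*||y_k - X_k beta_k||^2 + lam * 0.5*||beta_k - mu||^2,
        # where the penalty equals KL(N(beta_k, I) || N(mu, I)) for each k.
        # Minimized by block coordinate descent over (beta_1,...,beta_K, mu).
        p = Xs[0].shape[1]
        # Warm-start each beta_k at its dataset-specific least-squares fit.
        betas = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in zip(Xs, ys)]
        mu = np.mean(betas, axis=0)
        # Pre-invert the per-dataset ridge systems (X_k'X_k + lam*I).
        invs = [np.linalg.inv(X.T @ X + lam * np.eye(p)) for X in Xs]
        xtys = [X.T @ y for X, y in zip(Xs, ys)]
        for _ in range(n_iter):
            mu_old = mu
            # Given mu: beta_k = (X_k'X_k + lam*I)^{-1} (X_k'y_k + lam*mu).
            betas = [A @ (b + lam * mu) for A, b in zip(invs, xtys)]
            # Given the betas: the optimal centroid is their average.
            mu = np.mean(betas, axis=0)
            if np.linalg.norm(mu - mu_old) < tol:
                break
        return np.array(betas), mu

    # Hypothetical demo: five studies whose true coefficients vary mildly
    # around a shared vector; lam controls the degree of pooling.
    rng = np.random.default_rng(0)
    shared = rng.normal(size=3)
    Xs = [rng.normal(size=(50, 3)) for _ in range(5)]
    ys = [X @ (shared + 0.1 * rng.normal(size=3)) + rng.normal(size=50) for X in Xs]
    betas, centroid = kl_shrinkage_meta(Xs, ys, lam=5.0)

In this sketch, lam = 0 recovers the dataset-specific least-squares estimators and lam -> infinity pools all datasets toward a single centroid; the paper's contribution is an estimator that adapts this degree of shrinkage to the heterogeneity actually present in the data.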

Keywords

Data integration

Penalized regression

Information geometry

Stein shrinkage

Data privacy 

Main Sponsor

ENAR