Advances in Bayesian Inference on Causal, Longitudinal and Survival Models

Chair: Tianjian Zhou, Colorado State University

Thursday, Aug 7: 8:30 AM - 10:20 AM
Contributed Papers (Session 4202)
Music City Center, Room: CC-102A

Main Sponsor

Section on Bayesian Statistical Science

Presentations

WITHDRAWN - A Bayesian Causal Model for Matrix-Valued Exposure with Applications to Radiotherapy Planning

In cancer radiotherapy, radiation dose to organs-at-risk (OARs) adjacent to the target tumor should be minimized to avoid toxicity. Dose-volume histograms (DVHs), used to summarize radiation exposure, can be arranged in matrix form to represent the dose to multiple OARs. Understanding the causal link between this matrix-valued exposure and toxicity could inform treatment planning, but conventional causal models are not tailored to high-dimensional matrix-valued data. We propose a regularized Bayesian joint model for the matrix-valued DVH exposure. Dimension reduction is achieved via multilinear principal component analysis, which extracts features from both the rows and columns of the matrix. A Hamiltonian Monte Carlo algorithm is adapted for estimation. Simulations assess the model's performance, and an application demonstrates the model's ability to identify relevant effects. For interpretation, the dose effects are mapped back to the original DVH matrix. We also extend the model to account for biologically monotonic dose effects on toxicity outcomes using a projection approach and discuss how this monotonicity constraint affects the causal interpretation of the model.
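
As a rough illustration of the dimension-reduction step only (not the authors' implementation), the sketch below applies multilinear PCA via mode-wise SVDs to a stack of simulated DVH matrices; all array sizes, component counts, and variable names are assumptions.

```python
# Minimal multilinear PCA sketch: project each OAR-by-dose-bin DVH matrix onto
# low-dimensional row (OAR) and column (dose-bin) subspaces via mode-wise SVDs.
# All dimensions, component counts, and names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_oars, n_bins = 200, 5, 50                  # hypothetical sizes
X = rng.gamma(shape=2.0, scale=1.0, size=(n_patients, n_oars, n_bins))
X_centered = X - X.mean(axis=0)                          # center over patients

# Mode unfoldings: stack patients and the other mode along the columns
unfold_oars = X_centered.transpose(1, 0, 2).reshape(n_oars, -1)
unfold_bins = X_centered.transpose(2, 0, 1).reshape(n_bins, -1)

U_oars, _, _ = np.linalg.svd(unfold_oars, full_matrices=False)
U_bins, _, _ = np.linalg.svd(unfold_bins, full_matrices=False)
P, Q = U_oars[:, :2], U_bins[:, :3]                      # keep 2 row, 3 column components

# Low-dimensional features for patient i: P' X_i Q, a 2 x 3 score matrix
scores = np.einsum('or,iod,dc->irc', P, X_centered, Q)
print(scores.shape)                                       # (200, 2, 3)
```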

Keywords

Bayesian Modeling

Causal Inference

Dose-Volume Histograms

Matrix-Valued Data

Multilinear Principal Component Analysis

Radiotherapy Planning 

Co-Author(s)

Zhihui (Amy) Liu, Princess Margaret Cancer Centre, University Health Network
Olli Saarela, University of Toronto

First Author

Zijin (Frank) Liu, University of Toronto

Bayesian Analysis of Clinical Trials with a Delayed Treatment Effect

In the analysis of clinical trials with a survival endpoint, it is important to account for the possibility of a delay in the separation of the survival curves, known as a delayed treatment effect (DTE). DTEs are commonly observed in immuno-oncology trials, and failing to account for the possibility of a DTE in the design and analysis of a trial can lead to a substantial loss of power. We introduce a Bayesian method for assessing the efficacy of a treatment in the presence of a DTE. In our method, the baseline hazard is modeled using a cubic spline, which has the advantage over a piecewise constant model of capturing any smooth shape. To account for the treatment effect delay, we specify a non-constant hazard ratio, estimating the delay instead of prespecifying it. We study the properties of the methodology and demonstrate superior power of our method over the Cox proportional hazards model in the case of a nonzero delay. The proposed methodology is applied to the analysis of an immuno-oncology trial for first-line treatment of extensive-stage small-cell lung cancer. 
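
For intuition about the delayed-effect structure described above, here is a minimal sketch in which a Weibull hazard stands in for the cubic-spline baseline; the delay, hazard ratio, and Weibull parameters are illustrative assumptions, and this is not the authors' model or code.

```python
# Sketch of a delayed-treatment-effect hazard: the treated and control hazards
# coincide before the delay; afterwards the treated hazard is scaled by a hazard
# ratio. A Weibull baseline is a placeholder for the cubic-spline baseline.
import numpy as np

def baseline_hazard(t, shape=1.3, scale=10.0):
    # Weibull hazard h0(t) = (shape / scale) * (t / scale)**(shape - 1)
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def treated_hazard(t, delay=3.0, log_hr=-0.5):
    # The hazard ratio applies only after the (estimated) delay
    hr = np.where(t < delay, 1.0, np.exp(log_hr))
    return baseline_hazard(t) * hr

def survival(hazard_fn, t, n_grid=2000):
    # S(t) = exp(-cumulative hazard), approximated by a Riemann sum
    grid = np.linspace(1e-8, t, n_grid)
    return np.exp(-np.sum(hazard_fn(grid)) * (grid[1] - grid[0]))

print(survival(baseline_hazard, 12.0), survival(treated_hazard, 12.0))
```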

Keywords

Clinical trial design

Delayed treatment effect

Bayesian design

Cubic spline

Immuno-oncology trial 

Co-Author(s)

Qing Li, Merck & Co., Inc.
Jia Hua, Merck
Amarjot Kaur, Merck & Co., Inc.
Joseph Ibrahim, University of North Carolina

First Author

Anil Anderson, University of North Carolina at Chapel Hill

Presenting Author

Anil Anderson, University of North Carolina at Chapel Hill

Causal inference in meta-analyses with Dirichlet process mixture models and Bayesian networks

We propose a flexible framework for conducting meta-analyses from published papers. The method approximates the joint probability density function for similar variables in each constituent paper with a hierarchical Dirichlet process Gaussian mixture model. Hyperparameters for the mixture component parameters and the mixing proportions are allowed to correlate among subsets of papers with similar properties. With this posterior for the density function, we generate new and complete data observations for use in causal inference with structure learning on Bayesian networks. This framework incorporates whole datasets in the meta-analysis, provides a flexible means to handle heterogeneity in study design with correlated hyperparameters, and mitigates issues with causal sufficiency endemic to causal inference with Bayesian networks by combining latent and observed variables across studies. We apply this technique to studies of the gut microbiome, show that its predictions are viable, and demonstrate the key insights it offers. As open data access continues to proliferate, we proffer a novel means of data reuse.
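
As a loose illustration of the density-approximation step (not the authors' hierarchical model), the sketch below fits scikit-learn's truncated Dirichlet-process Gaussian mixture to simulated data from one hypothetical study and draws complete pseudo-observations that could feed a downstream structure-learning step; all data and settings are invented.

```python
# Sketch of the density-approximation step using scikit-learn's truncated
# Dirichlet-process Gaussian mixture as a stand-in for the hierarchical DP
# mixture in the abstract. The "study" data below are simulated, and the
# downstream Bayesian-network step is not shown.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)
# Pretend these are two shared variables reported by one constituent study
study_data = np.vstack([
    rng.normal([0.0, 1.0], 0.5, size=(150, 2)),
    rng.normal([3.0, -1.0], 0.8, size=(100, 2)),
])

dpgmm = BayesianGaussianMixture(
    n_components=10,                                   # truncation level
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(study_data)

# Draw "complete" pseudo-observations from the fitted density; these could
# feed a structure-learning step for a Bayesian network.
synthetic, _ = dpgmm.sample(n_samples=500)
print(synthetic.shape)                                  # (500, 2)
```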

Keywords

meta-analysis

causal inference

Dirichlet process mixture model

Bayesian hierarchical modeling

Bayesian network

data reuse 

Co-Author

Eric Vance, LISA, University of Colorado-Boulder

First Author

Ellery Galvin

Presenting Author

Ellery Galvin

Counterpart Statistics in the Matched Difference-in-Differences Design

Difference-in-differences (DiD) estimates intervention effects under the parallel trends assumption, but nuisance trends can bias these estimates. Matching methods that balance pre-intervention trends have been used, yet we show that they fail to adjust for latent confounders and introduce regression-to-the-mean bias. Instead, we advocate for methods grounded in explicit causal assumptions about selection bias. We also propose a Bayesian approach to assess parallel trends, avoiding the challenges of specifying non-inferiority thresholds. We demonstrate our method using Medical Expenditure Panel Survey data to estimate the impact of health insurance on healthcare utilization.
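
For readers new to the design, the following sketch computes the canonical two-group, two-period DiD contrast on simulated utilization counts; it is illustrative only and does not implement the authors' matching procedure or their Bayesian assessment of parallel trends.

```python
# Canonical two-group, two-period DiD contrast on simulated utilization counts.
# All values are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 500
treat_pre  = rng.poisson(4.0, n)   # e.g., office visits before the intervention
treat_post = rng.poisson(5.0, n)   # after the intervention (includes its effect)
ctrl_pre   = rng.poisson(4.0, n)
ctrl_post  = rng.poisson(4.3, n)   # common secular trend only

did = (treat_post.mean() - treat_pre.mean()) - (ctrl_post.mean() - ctrl_pre.mean())
print(f"DiD estimate of the intervention effect: {did:.2f}")
```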

Keywords

Difference-in-differences

Matching

Non-equivalent control

Measurement error

Health policy evaluation

Triple difference 

Co-Author

Bo Lu, The Ohio State University

First Author

Sean Tomlin, The Ohio State University

Presenting Author

Sean Tomlin, The Ohio State University

Refining Subgroup Analysis in Clinical Trials: A Bayesian Hierarchical Approach

Subgroup analysis is crucial in clinical trials for understanding how different subgroups respond to treatments and identifying patient characteristics that may influence efficacy or safety. However, it faces limitations, including small sample sizes, increased variability, and the risk of false discoveries due to multiple comparisons. These challenges often result in unreliable conclusions and hinder the generalization of findings.
Bayesian Hierarchical Modeling (BHM) with shrinkage estimation addresses these issues by modeling subgroup-specific parameters as deviations from the population parameter, allowing information to be shared across subgroups. This hierarchical structure stabilizes estimates by pulling extreme values toward more plausible estimates derived from the overall data, improving precision, reducing bias, and enhancing generalizability (Pennello et al.). It also quantifies uncertainty, providing clearer insights into treatment effects and supporting better decision-making, particularly in the presence of unbalanced data.
The concept of BHM and its implementation in subgroup analysis will be discussed in detail with a simulated example from an oncology trial.
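
A minimal sketch of the shrinkage idea, assuming a normal-normal hierarchical model with a known between-subgroup standard deviation; the subgroup estimates, standard errors, and tau below are invented for illustration and do not correspond to the presenter's simulated oncology example.

```python
# Normal-normal shrinkage sketch: subgroup treatment-effect estimates are pulled
# toward the overall effect, with noisier subgroups shrunk more. The estimates,
# standard errors, and between-subgroup SD are invented for illustration.
import numpy as np

y   = np.array([0.90, 0.10, -0.40, 0.55])   # observed subgroup effects
se  = np.array([0.50, 0.20,  0.60, 0.25])   # their standard errors
tau = 0.30                                   # assumed between-subgroup SD

# Precision-weighted overall (population-level) effect
w  = 1.0 / (se**2 + tau**2)
mu = np.sum(w * y) / np.sum(w)

# Shrinkage factor B_g = se_g^2 / (se_g^2 + tau^2): each subgroup estimate is
# pulled toward mu in proportion to its noisiness.
B = se**2 / (se**2 + tau**2)
post_mean = (1.0 - B) * y + B * mu
print(np.round(post_mean, 3))
```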

Keywords

Bayesian inference

shrinkage estimation

subgroup analysis 

First Author

Veerendra Nayak, Cytel Inc

Presenting Author

Veerendra Nayak, Cytel Inc

Variable Selection in Threshold Regression Models for Survival Data

Threshold regression, or first-hitting-time regression, is an alternative to the Cox proportional hazards model when the proportional hazards assumption is violated for survival data. It defines the event time as the first time a latent stochastic process hits a boundary. When the underlying process is a Wiener diffusion process, the event time follows an inverse Gaussian distribution. The process is characterized by its initial level at time zero and its degradation rate; modeling these through separate regression functions for baseline health and degradation allows health trajectories to be described, but it complicates variable selection. This study evaluated variable selection methods for threshold regression using simulations, comparing frequentist approaches (forward selection, backward selection, ThregBAR) with Bayesian methods (horseshoe, LASSO). The Bayesian LASSO demonstrated accurate, stable performance, while the Bayesian horseshoe was sensitive to scaling. Among frequentist methods, forward selection performed best, ThregBAR had the lowest false-negative rates, and backward selection was least effective.
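
To connect the Wiener-process story to the inverse Gaussian distribution, the sketch below simulates first hitting times of a downward-drifting process and compares them with the corresponding inverse Gaussian mean; the starting level, drift, volatility, and step size are illustrative assumptions.

```python
# Simulate first hitting times of a downward-drifting Wiener process starting at
# y0 and compare with the inverse Gaussian mean implied by that setup.
import numpy as np
from scipy.stats import invgauss

rng = np.random.default_rng(3)
y0, drift, sigma = 5.0, -1.0, 1.0             # initial health level, degradation rate
dt, n_steps, n_paths = 0.002, 10000, 1000     # simulation horizon of 20 time units

hits = np.full(n_paths, np.nan)
for i in range(n_paths):
    steps = drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
    path = y0 + np.cumsum(steps)
    crossed = np.nonzero(path <= 0.0)[0]
    if crossed.size:
        hits[i] = (crossed[0] + 1) * dt

# Theory: the hitting time is inverse Gaussian with mean y0/|drift| and shape
# y0^2/sigma^2; scipy parameterizes this as invgauss(mean/shape, scale=shape).
mean, shape = y0 / abs(drift), y0**2 / sigma**2
print(np.nanmean(hits), invgauss(mean / shape, scale=shape).mean())
```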

Keywords

First hitting time regression model

Survival Data

Bayesian Methods

Variable selection 

Co-Author

Michael Pennell, The Ohio State University

First Author

Shuxian Ning, The Ohio State University

Presenting Author

Shuxian Ning, The Ohio State University

Two methods of Bayesian model averaging (2D Monte Carlo type) to assess effects of covariate errors

Measurement error alters the shape of the exposure-response relationship and hence the extrapolated risk. Bayesian model averaging (BMA) methods for dealing with shared errors, which are common in many datasets, have received much attention. We test two types of BMA model. The first, quasi-2DMC+BMA, is similar to the BMA method proposed by Hoeting et al. but distinct from the 2DMC+BMA method of Kwon et al. (Stat Med 2016;35:399-423). The second and newer type, which we term marginal-quasi-2DMC+BMA, uses a more complex marginal calculation that is closer to 2DMC+BMA. Assuming a true linear exposure-response model, coverage probabilities for the linear coefficient are 90-95% for quasi-2DMC+BMA but only 52-60% for marginal-quasi-2DMC+BMA. Assuming a true linear-quadratic model, coverage probabilities of both the linear and quadratic coefficients for quasi-2DMC+BMA are <5% when shared Berkson error is 50%. By comparison, coverage probabilities for both coefficients under the marginal-quasi-2DMC+BMA method are generally too high, at approximately 100%. The poor coverage results from substantial bias, both positive and negative. In summary, both the quasi-2DMC+BMA and marginal-quasi-2DMC+BMA methods perform poorly, exhibiting bias and poor coverage.
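
As background only, the sketch below simulates a shared multiplicative Berkson dose error and combines linear and linear-quadratic fits with generic BIC-based BMA weights; it is not the quasi-2DMC+BMA or marginal-quasi-2DMC+BMA algorithm studied in the abstract, and every quantity in it is hypothetical.

```python
# Generic sketch of two ingredients mentioned in the abstract: a shared
# multiplicative Berkson dose error and BIC-weight model averaging over linear
# and linear-quadratic dose-response models. NOT the quasi-2DMC+BMA or
# marginal-quasi-2DMC+BMA algorithm; every value below is invented.
import numpy as np

rng = np.random.default_rng(4)
n, beta1 = 400, 0.8                              # true model is linear in true dose
nominal = rng.uniform(0.0, 2.0, n)               # recorded (nominal) doses
shared  = rng.lognormal(0.0, 0.5)                # shared multiplicative Berkson factor
true_dose = nominal * shared * rng.lognormal(0.0, 0.2, n)   # plus unshared error
y = beta1 * true_dose + rng.normal(0.0, 0.5, n)  # response driven by true dose

def fit_bic(X, y):
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    bic = len(y) * np.log(rss[0] / len(y)) + X.shape[1] * np.log(len(y))
    return beta, bic

X_lin  = np.column_stack([np.ones(n), nominal])
X_quad = np.column_stack([np.ones(n), nominal, nominal**2])
(b_lin, bic_lin), (b_quad, bic_quad) = fit_bic(X_lin, y), fit_bic(X_quad, y)

# Approximate posterior model weights from BIC (subtract the minimum for stability)
bics = np.array([bic_lin, bic_quad])
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()
beta1_bma = w[0] * b_lin[1] + w[1] * b_quad[1]
print(dict(weights=np.round(w, 3), beta1_bma=round(float(beta1_bma), 3)))
```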

Keywords

Covariate measurement error

Bayesian model averaging

Radiation

Classical error

Berkson error 

Co-Author(s)

Nobuyuki Hamada, Biology and Environmental Chemistry Division, Sustainable System Research Laboratory, CRIEPI
Lydia Zablotska, UCSF

First Author

Mark Little, Radiation Epidemiology Branch, National Cancer Institute

Presenting Author

Mark Little, Radiation Epidemiology Branch, National Cancer Institute