Generalized Bayesian Inference for Dynamic Random Dot Product Graphs
Monday, Aug 4: 11:35 AM - 11:40 AM
1175
Contributed Speed
Music City Center
The random dot product graph (RDPG) is a popular model for network data, with extensions that accommodate dynamic (time-varying) networks. However, two significant gaps exist in the dynamic RDPG literature: (1) there is no coherent Bayesian way to update one's prior beliefs about the model parameters, owing to their complicated constraints, and (2) there is no approach for forecasting future networks with meaningful uncertainty quantification. This work proposes a generalized Bayesian framework that addresses both gaps via a Gibbs posterior, which represents a coherent updating of Bayesian beliefs based on a least-squares loss function. Furthermore, we establish the consistency and contraction rate of this Gibbs posterior under commonly adopted Gaussian random walk priors. For estimation, we develop a fast Gibbs sampler whose time complexity is linear in both the number of time points and the number of observed edges in the dynamic network. Simulations and real data analyses show that the proposed method outperforms competitors in both in-sample and forecasting performance.
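To illustrate the core idea of a Gibbs posterior for an RDPG, the sketch below targets a density proportional to exp(-eta * least-squares loss) times a Gaussian prior on the latent positions. This is a minimal, hypothetical example on a single static network: it uses a generic random-walk Metropolis sampler rather than the paper's Gibbs sampler, omits the dynamic (random walk over time) structure, and all names (`sq_loss`, `log_gibbs`, the learning-rate parameter `eta`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate a small static RDPG (hypothetical data, not the paper's) ---
n, d = 20, 2
X_true = rng.uniform(0.2, 0.8, size=(n, d)) / np.sqrt(d)  # latent positions
P = X_true @ X_true.T                                     # edge probabilities
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                               # symmetric, no self-loops

def sq_loss(X):
    """Least-squares loss between observed edges and latent dot products."""
    R = A - X @ X.T
    return np.sum(np.triu(R, 1) ** 2)

def log_gibbs(X, eta=1.0, tau=1.0):
    """Log Gibbs posterior: -eta * loss plus a Gaussian prior on X."""
    return -eta * sq_loss(X) - 0.5 * np.sum(X ** 2) / tau ** 2

# Random-walk Metropolis targeting the Gibbs posterior (a stand-in for
# the paper's linear-time Gibbs sampler, used here only for illustration).
X = rng.normal(0.0, 0.1, size=(n, d))
cur = log_gibbs(X)
accepts = 0
for _ in range(2000):
    prop = X + rng.normal(0.0, 0.05, size=(n, d))
    new = log_gibbs(prop)
    if np.log(rng.uniform()) < new - cur:
        X, cur, accepts = prop, new, accepts + 1
```

In the dynamic setting, the loss would sum over time points and the Gaussian prior would become a random walk linking each time point's latent positions to the previous one's; the sketch keeps a single snapshot to stay short.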
Keywords: Gibbs posterior, dynamic network data, network latent space models, statistical network analysis, Bayesian inference, forecasting
Main Sponsor: Section on Statistical Learning and Data Science