Assumption-Lean Inference for Spectral Differential Network Analysis of High-Dimensional Time Series

Speaker: Michael Hellstern
 
Tuesday, Aug 5: 2:05 PM - 2:25 PM
Topic-Contributed Paper Session 
Music City Center 
Analyzing networks of multivariate time series is popular in many fields, from neuroscience to seismology. The inverse spectral density is a common choice for time series network analysis because it captures the frequency-domain correlation between two variables after removing the best linear predictor based on all other variables. In many applications, the goal is to study how these networks change across conditions; in neuroscience, for example, one might ask how the brain connectivity network changes before and after stimulation. With this in mind, we develop a direct estimator of the difference between two high-dimensional inverse spectral densities. Leveraging recent advances in multivariate time series analysis, we establish consistency of our estimator under only mild dependence conditions. Using a new convergence rate for high-dimensional spectral density estimators, we obtain a flexible convergence rate for the proposed direct estimator that accommodates both varying smoothing spans and dependence in the data. Building on this rate and on new results about the form of the asymptotic distribution of the spectral density estimator, we also develop a valid inference procedure that handles asymptotic distributions with arbitrary scaling. Finally, to make the procedure computationally tractable, we exploit previously overlooked estimating equations to implement an efficient algorithm. The method is illustrated on synthetic data, on electroencephalography data, and on optogenetic stimulation with micro-electrocorticography data.
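For context, the quantity being differenced can be sketched with a naive plug-in baseline: estimate each condition's spectral density matrix (here via Welch-averaged cross-periodograms from SciPy), invert at a frequency of interest, and subtract. This is only an illustration of the target parameter, not the talk's direct estimator or its inference procedure; the white-noise example, function name, and tuning choices below are assumptions for the sketch.

```python
import numpy as np
from scipy.signal import csd

def spectral_density_matrix(X, fs=1.0, nperseg=256):
    """Welch-type estimate of the (frequencies x p x p) spectral density matrix.

    X: array of shape (n_samples, p), one column per series.
    """
    p = X.shape[1]
    # One call to get the frequency grid, then fill every cross-spectrum.
    f, _ = csd(X[:, 0], X[:, 0], fs=fs, nperseg=nperseg)
    S = np.zeros((len(f), p, p), dtype=complex)
    for j in range(p):
        for k in range(p):
            _, S[:, j, k] = csd(X[:, j], X[:, k], fs=fs, nperseg=nperseg)
    return f, S

# Toy "pre" and "post" conditions: white noise, with one linear
# connection added after a hypothetical stimulation.
rng = np.random.default_rng(0)
n, p = 2048, 4
X_pre = rng.standard_normal((n, p))
X_post = rng.standard_normal((n, p))
X_post[:, 1] += 0.5 * X_post[:, 0]

f, S_pre = spectral_density_matrix(X_pre)
_, S_post = spectral_density_matrix(X_post)

# Plug-in difference of inverse spectral densities at one frequency.
k = 10
Delta = np.linalg.inv(S_post[k]) - np.linalg.inv(S_pre[k])
```

In high dimensions this plug-in route requires each inverse to be estimated well on its own, which is the weakness a direct difference estimator avoids by targeting the difference itself.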