The Promises and Perils of Long Time: Recent Advances in Astronomical Time Series

Yang Chen Chair
University of Michigan
 
Vinay Kashyap Organizer
Center for Astrophysics | Harvard & Smithsonian
 
Aneta Siemiginowska Organizer
Harvard-Smithsonian Center for Astrophysics
 
Wednesday, Aug 7: 10:30 AM - 12:20 PM
1814 
Topic-Contributed Paper Session 
Oregon Convention Center 
Room: CC-B119 
Astronomy, as an observational science, has a unique relationship with data. New telescopes are regularly built with more sensitive instruments, operating across many wavelengths and with new modes of measurement such as neutrinos and gravitational waves, and data quality keeps improving. Many observations are unrepeatable because they document one-time phenomena (such as supernovae or flares), each of which has unique characteristics, yet they must be analyzed as part of a population spanning observations over decades or centuries. This poses an analysis challenge when older data must be used together with newer data. The issue matters because the variability of sources carries information about physical processes: it allows us, e.g., to constrain the physical sizes of unresolved sources, measure masses, and detect exoplanets.

We focus here on long-duration time series data. A prominent example is the record of daily sunspot numbers, which has been maintained since the mid-1700s and steadily augmented with additional proxies such as magnetic field and radio measurements. The generation-spanning maintenance of these data presents an extraordinary analysis challenge to statisticians. A rich trove of photographic plate surveys (dating back to the mid-1800s) is in the process of being digitized (DASCH). More recently, space telescopes like Hubble (optical) and Chandra and XMM-Newton (X-ray) have been observing continuously for several decades, augmented by a fleet of smaller missions and new great observatories like JWST. The EUV and X-ray emission from the Sun has been monitored for decades with space-borne telescopes like GOES and SDO. Ground-based surveys like Pan-STARRS and ZTF have been producing years-long light curves, compiling information at high cadence and sensitivity. Telescopes like Kepler and TESS have compiled high-quality data on selected stars in the Galaxy as a byproduct of exoplanet hunting. But all these datasets are set to be dwarfed by the next generation of surveys like the Square Kilometre Array and the Rubin LSST, which will produce an order of magnitude more raw data and bring a qualitative revolution to astronomy.

The challenges in merging information across this large variety of data streams are self-evident. Analyses require a multi-disciplinary effort to avoid the pitfalls of improper or false inference. Astronomers are recognizing this and have begun to strategize solutions (e.g., working groups on variability monitoring strategies for HST and JWST, and the Informatics and Statistics Science Collaboration for LSST). Our session brings together astronomers and statisticians who have been working on these problems and can speak to their hard-won triumphs. It will introduce the challenges involved in collecting and analyzing long-duration astronomical datasets and demonstrate the techniques currently in use. We expect this to spark discussions between astronomers and statisticians and to lead to new and improved methods for teasing out qualitatively new information from these datasets.

Applied

Yes

Main Sponsor

Section on Physical and Engineering Sciences

Co Sponsors

Astrostatistics Interest Group
Section on Statistical Computing

Presentations

Predicting a Multi-Peak Solar Cycle Using a Multi-Stage Analysis

We develop a data-driven approach to describe the multi-peak behavior of the solar activity cycle. The method builds upon a multilevel Bayesian model for a single-peaked solar cycle. While the latter uses only monthly mean sunspot numbers as a proxy for solar activity, our approach incorporates additional physical data and uses Gaussian process regression to capture complex features of the solar cycle that are missed by the single-peak, single-proxy model. We demonstrate the capabilities of our methodology using hindcasts of previous cycle morphologies, and we make a prediction for the timing and characteristics of the upcoming solar cycle maximum. 
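
As an illustration of the kind of modeling involved (a minimal sketch, not the authors' actual pipeline), the code below fits a Gaussian process regression to a toy, double-peaked monthly sunspot series using scikit-learn; the kernel, length scales, and the simulated cycle are assumptions made purely for demonstration.

    # Minimal sketch: GP regression on a toy double-peaked sunspot cycle (not the authors' pipeline).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

    # t: decimal year, s: monthly mean sunspot number (simulated, assumed shapes)
    t = np.linspace(2009.0, 2020.0, 132)                # one ~11-year cycle of months
    s = (80 * np.exp(-0.5 * ((t - 2014.0) / 1.8) ** 2)
         + 25 * np.exp(-0.5 * ((t - 2012.0) / 0.8) ** 2)
         + np.random.default_rng(0).normal(0, 5, t.size))   # toy double-peaked cycle + noise

    # Smooth kernel plus white noise; the hyperparameter starting values are assumptions.
    kernel = ConstantKernel(100.0) * RBF(length_scale=1.0) + WhiteKernel(noise_level=25.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(t.reshape(-1, 1), s)

    t_grid = np.linspace(t.min(), t.max(), 500).reshape(-1, 1)
    mean, std = gp.predict(t_grid, return_std=True)     # smoothed cycle with uncertainty
    peak_time = t_grid[np.argmax(mean), 0]              # crude estimate of the cycle maximum
    print(f"estimated maximum near {peak_time:.2f}")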

Co-Author

David Stenning, Simon Fraser University

Speaker

Vinay Kashyap, Center for Astrophysics | Harvard & Smithsonian

Unravelling the Physics of Black Holes Using Astronomical Time Series: From Seconds to Decades

Black holes are at the heart of many open questions in astrophysics. They are prime laboratories for studying the effects of strong gravity and are thought to play a significant role in the evolution of the universe. Much of our knowledge of these sources comes from studies of X-ray binaries, in which a black hole exists in a binary system with a star. Of particular interest are time series of their brightness, usually measured in X-rays, which show complex variations on timescales from sub-seconds to decades. Connecting properties of these (often stochastic) time series to physical models of how matter falls into black holes enables probes of fundamental physics, but requires sophisticated methods. In this talk, I will introduce black holes as important astrophysical sources and give an overview of the types of data we observe from them with X-ray telescopes. I will then survey current statistical and machine learning methods for characterizing the information about the physical system contained in these datasets, and present both the state of the art and future directions of time series analysis for black holes. 
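
As a flavor of how such stochastic variability is characterized in practice, here is a minimal sketch (an illustrative choice, not the speaker's method) that computes the periodogram of a simulated red-noise X-ray light curve with numpy; the AR(1) process, cadence, and normalization are assumptions.

    # Minimal sketch: periodogram of a simulated stochastic light curve (illustrative only).
    import numpy as np

    rng = np.random.default_rng(4)
    dt = 0.01                                  # time resolution in seconds (assumed)
    n = 2 ** 14
    # Toy red-noise light curve: an AR(1) process mimicking aperiodic accretion variability.
    x = np.empty(n)
    x[0] = 0.0
    for i in range(1, n):
        x[i] = 0.98 * x[i - 1] + rng.normal(0.0, 1.0)

    freqs = np.fft.rfftfreq(n, d=dt)[1:]       # drop the zero frequency
    power = np.abs(np.fft.rfft(x - x.mean())[1:]) ** 2 * 2 * dt / n   # periodogram
    # The shape of power vs. frequency (broken power laws, quasi-periodic bumps, etc.)
    # is what gets compared against physical models of the accretion flow.
    print("frequency range sampled (Hz):", freqs[0], freqs[-1])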

Speaker

Daniela Huppenkothen

Classifying the Sky with ZTF

The Zwicky Transient Facility (ZTF) has successfully collected hundreds of data points each for over a billion sources in the Northern sky over the last few years, creating a rich dataset for astronomical analysis. Leveraging advanced machine learning techniques, specifically deep neural networks (DNNs) and XGBoost, we have developed binary classifiers to sift through this vast time-series data, enhancing our ability to classify astronomical phenomena accurately. We outline the methodologies employed, challenges encountered, and the innovative solutions devised. Additionally, we explore the integration of data from various surveys, the impact of observational cadences, and the implications for future surveys like LSST/Rubin, particularly in the context of transfer learning and the pursuit of fainter celestial objects and outliers.
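
A minimal sketch of the classification step described above, assuming per-source summary features have already been extracted from the light curves; the feature table, labels, and hyperparameters below are toy assumptions, not the ZTF production setup.

    # Minimal sketch: XGBoost binary classifier on toy light-curve features (not the ZTF pipeline).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score
    from xgboost import XGBClassifier

    rng = np.random.default_rng(1)
    n = 5000
    # Toy feature table standing in for, e.g., amplitude, period, skewness, median magnitude.
    X = rng.normal(size=(n, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)   # toy labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05,
                        subsample=0.8, eval_metric="logloss")
    clf.fit(X_tr, y_tr)
    print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))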
 

Speaker

Ashish Mahabal, Caltech

Discovering Anomalous Physics in Realtime

Supernovae mark the explosive deaths of stars and enrich the cosmos with heavy elements. Future telescopes will discover thousands of new supernovae nightly, creating a need to rapidly flag astrophysically interesting events for follow-up study. Ideally, such an anomaly detection pipeline would be independent of our current knowledge and sensitive to unexpected phenomena. I will discuss recent developments in building multi-modal, physics-informed and physics-agnostic anomaly detection algorithms for multivariate time series. 
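
To make the idea concrete, here is a minimal, physics-agnostic sketch that flags unusual events with an Isolation Forest over summary features of light curves; the features, contamination rate, and toy data are assumptions, not the speaker's actual pipeline.

    # Minimal sketch: Isolation Forest anomaly scores on toy transient features (illustrative only).
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(2)
    # Toy feature vectors standing in for, e.g., rise time, decline rate, peak color, peak magnitude.
    normal = rng.normal(loc=0.0, scale=1.0, size=(2000, 4))   # "ordinary" supernovae
    odd = rng.normal(loc=4.0, scale=1.0, size=(20, 4))        # rare, unusual events
    X = np.vstack([normal, odd])

    iso = IsolationForest(n_estimators=200, contamination=0.01, random_state=0).fit(X)
    scores = iso.score_samples(X)                             # lower score = more anomalous
    flagged = np.argsort(scores)[:20]                         # top candidates for follow-up
    print("flagged indices:", flagged)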

Speaker

V. Ashley Villar, Harvard University

Detecting Stellar Flares Using Conditional Volatility

For more than forty years, discrete-time models have been developed to reflect the so-called stylized features of financial time series. These properties, which include tail heaviness, asymmetry, volatility clustering, and serial dependence without correlation, cannot be captured with traditional linear ARMA time series models. Continuous-time ARMA (CARMA) models are the continuous-time counterparts of the well-known ARMA models, and they are convenient for modeling astronomical data, which are often unequally spaced in time. In this talk we will review ARMA and CARMA models and their application in astrophysics. We then present a novel and powerful method for analyzing time series to detect flares in TESS light curves. First, we remove the trend using a time-varying deterministic harmonic fit so as to capture changes in the deterministic amplitude of the light curve. Then we highlight the analogy between the stochastic part of the light curves and GARCH processes. We demonstrate that flares can be detected as significantly large deviations from this baseline. We apply the method to exemplar light curves from two flaring stars and discuss some of the diagnostics that become amenable to measurement. 
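
A minimal sketch of the workflow described above, under stated assumptions (a toy light curve, a known period, and a 5-sigma threshold): a least-squares harmonic detrend followed by a GARCH(1,1) fit with the arch package. It is not the authors' exact implementation.

    # Minimal sketch: harmonic detrend + GARCH(1,1) conditional volatility for flare flagging.
    import numpy as np
    from arch import arch_model

    rng = np.random.default_rng(3)
    t = np.arange(0.0, 10.0, 0.01)                       # days, toy short-cadence sampling
    period = 2.5                                         # assumed rotational period
    flux = 1.0 + 0.05 * np.sin(2 * np.pi * t / period) + rng.normal(0, 0.01, t.size)
    flux[420:430] += 0.2 * np.exp(-np.arange(10) / 3.0)  # injected toy flare

    # Harmonic (sinusoidal) detrending via least squares on sine/cosine terms.
    A = np.column_stack([np.ones_like(t),
                         np.sin(2 * np.pi * t / period), np.cos(2 * np.pi * t / period)])
    coeff, *_ = np.linalg.lstsq(A, flux, rcond=None)
    resid = flux - A @ coeff

    # GARCH(1,1) on rescaled residuals; conditional volatility gives a time-varying baseline.
    res = arch_model(100 * resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")
    sigma = res.conditional_volatility / 100
    candidates = np.where(resid > 5 * sigma)[0]          # flare candidates: >5-sigma excursions
    print("flare candidate indices:", candidates)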

Co-Author

Vinay Kashyap, Center for Astrophysics | Harvard & Smithsonian

Speaker

Giovanni Motta, Columbia University