Speaker
Description
With large volumes of time-domain active galactic nuclei (AGN) data across multiple broad photometric bands anticipated from upcoming surveys such as the Vera C. Rubin Observatory’s Legacy Survey of Space and Time (LSST), statistical models must be able to process irregularly sampled light curves at scale and to represent the optical and UV variability of the compact region near the central black hole without imposing restrictive assumptions on the underlying process. Attention-based architectures have quickly become the de facto standard in deep learning for sequential data, owing to their ability to bridge long-range temporal dependencies and to parallelize within training examples in a way that is fundamentally impossible for recurrent models such as LSTMs. We consider a particular time-based attention model for this problem, the Heteroscedastic Temporal Variational Autoencoder (HeTVAE; Shukla and Marlin 2021), which shows notable promise: it propagates uncertainty through a ‘sparsity-aware’ heteroscedastic output layer when interpolating sparse, multivariate time series, with significantly shorter run times than Gaussian Process Regression-based methods that capture uncertainty through posterior inference. An appropriate and scalable means of reconstructing the light curves opens the way to better probing the underlying process of AGN variability and the black hole’s physical properties in the LSST era, in particular by estimating time lags from the UV and optical reprocessing of X-ray variability close to the black hole.
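To illustrate the idea of time-based attention with a sparsity-aware, heteroscedastic output, the following is a minimal sketch, not the actual HeTVAE architecture: a kernel-attention interpolator in which the predicted mean is a softmax-weighted sum over observed points, and the predictive variance is inflated wherever the un-normalized attention "intensity" (a proxy for local sampling density) is low. The function name, kernel choice, and variance rule are all illustrative assumptions.

```python
import numpy as np

def time_attention_interpolate(t_obs, x_obs, t_query, width=1.0):
    """Illustrative kernel-attention interpolation for an irregularly
    sampled 1-D light curve (NOT the actual HeTVAE model).

    Mean: softmax-weighted sum over observations, with similarity given
    by a squared-exponential kernel in time.
    Variance: grows where the un-normalized kernel mass is small, i.e.
    where the series is sparsely observed -- echoing the spirit of a
    'sparsity-aware' heteroscedastic output layer.
    """
    t_obs = np.asarray(t_obs, dtype=float)
    x_obs = np.asarray(x_obs, dtype=float)
    t_query = np.asarray(t_query, dtype=float)

    # Squared-exponential similarity between query and observed times: (Q, N)
    scores = -((t_query[:, None] - t_obs[None, :]) ** 2) / width

    # Softmax over observations -> attention weights for the mean.
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    mean = (w / w.sum(axis=1, keepdims=True)) @ x_obs

    # Un-normalized kernel mass as a proxy for local sampling density;
    # the variance inflates in gaps with few nearby observations.
    intensity = np.exp(scores).sum(axis=1)
    var = 1.0 / (intensity + 1e-8)
    return mean, var
```

Queried inside a well-sampled stretch, the variance stays small; queried inside a seasonal gap, the intensity collapses and the variance blows up, which is the qualitative behavior one wants when propagating uncertainty from irregular sampling into downstream time-lag estimates.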