The upcoming generation of cosmological surveys will map the Universe in great detail and on an unprecedented scale, but they pose new and formidable challenges at every level of the scientific analysis, from pixel-level data reduction to cosmological inference.
As powerful as Deep Learning (DL) has proven in recent years, a DL approach alone is in most cases insufficient to meet these challenges: it is typically plagued by poor robustness to covariate shifts, limited interpretability, and inadequate uncertainty quantification, all of which impede its use in a scientific analysis. In this talk, I will instead advocate a unified approach that merges the robustness and interpretability of physical models, the principled uncertainty quantification of a Bayesian framework, and the inference methodologies and computational frameworks brought about by the Deep Learning revolution.
In particular, I will show how deep generative models can be embedded within principled Bayesian physical modeling to solve a range of ill-posed astronomical inverse problems, from simple image denoising all the way to inferring the distribution of dark matter in the Universe. Conversely, I will illustrate how automatic differentiation allows physical simulations to be embedded as layers in Deep Learning systems, with the example of integrating a cosmological N-body simulation within a Recurrent Inference Machine to reconstruct the initial conditions of the Universe.
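To make the first idea concrete, here is a minimal sketch of gradient-based MAP estimation for a denoising inverse problem, written in JAX. Everything in it is a toy assumption: a simple Gaussian smoothness term stands in for the log-density of a trained deep generative model, and the 1-D signal stands in for an astronomical image. The structure, though, is the one described above: a physical likelihood plus a learned (here, hand-written) prior, optimized jointly by automatic differentiation.

```python
import jax
import jax.numpy as jnp

# Toy ill-posed inverse problem: recover a signal x from noisy data y = x + n.
# log p(x | y) = log p(y | x) + log p(x) + const. A Gaussian smoothness prior
# stands in here for the log-density of a trained deep generative model.

def log_likelihood(x, y, sigma=0.1):
    # Gaussian noise model (the "physical" forward model is just identity here).
    return -0.5 * jnp.sum((y - x) ** 2) / sigma**2

def log_prior(x):
    # Hypothetical stand-in prior: penalize differences between neighbors.
    return -0.5 * jnp.sum(jnp.diff(x) ** 2) / 0.01

def log_posterior(x, y):
    return log_likelihood(x, y) + log_prior(x)

# MAP estimation by gradient ascent on the log-posterior; automatic
# differentiation handles both the physical and the prior terms at once.
grad_fn = jax.jit(jax.grad(log_posterior))

key = jax.random.PRNGKey(0)
x_true = jnp.sin(jnp.linspace(0.0, 2.0 * jnp.pi, 64))
y = x_true + 0.1 * jax.random.normal(key, x_true.shape)

x = y  # initialize at the noisy observation
for _ in range(200):
    x = x + 1e-3 * grad_fn(x, y)

print("noisy MSE:   ", float(jnp.mean((y - x_true) ** 2)))
print("denoised MSE:", float(jnp.mean((x - x_true) ** 2)))
```

In the real applications discussed in the talk, `log_prior` would be the score or log-density of a deep generative model trained on simulations, and the likelihood would encode the instrument's actual forward model.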
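For the second idea, the following toy sketch shows what "a simulation as a differentiable layer" means in practice. A few leapfrog steps under a made-up 1-D force field stand in for a cosmological N-body code; because the evolution is written in JAX, gradients flow through the whole simulation, so the observed final state can be inverted for the initial conditions by plain gradient descent (the talk's Recurrent Inference Machine replaces this hand-written descent with a learned iterative updater).

```python
import jax
import jax.numpy as jnp

# Toy stand-in for a differentiable N-body simulation: evolve 1-D "particles"
# under a hypothetical force with a few leapfrog steps. Written in JAX, the
# whole evolution is differentiable end-to-end.

def simulate(x0, n_steps=10, dt=0.1):
    x, v = x0, jnp.zeros_like(x0)
    for _ in range(n_steps):
        v = v + 0.5 * dt * (-jnp.sin(x))  # kick (toy force field)
        x = x + dt * v                    # drift
        v = v + 0.5 * dt * (-jnp.sin(x))  # kick
    return x

# Reconstruct the initial conditions from the observed final positions by
# descending the data-misfit gradient *through* the simulation.
def loss(x0, x_obs):
    return jnp.sum((simulate(x0) - x_obs) ** 2)

grad_loss = jax.jit(jax.grad(loss))

x0_true = jnp.array([0.3, -0.7, 1.2, -0.1])
x_obs = simulate(x0_true)

x0 = jnp.zeros_like(x0_true)  # initial guess
for _ in range(500):
    x0 = x0 - 0.1 * grad_loss(x0, x_obs)

print(x0)  # should approach x0_true
```

The same pattern scales up: swap `simulate` for a differentiable N-body solver and the gradient-descent loop for a recurrent network that proposes the updates, and you have the architecture described above.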