Speaker
Description
Real-world datasets often comprise sets of observations that collectively constrain the parameters of an underlying model of interest. Such models typically have a hierarchical structure, where "local" parameters impact individual observations and "global" parameters influence the entire dataset. In this talk we introduce Bayesian and frequentist approaches for optimal dataset-wide probabilistic inference in cases where the likelihood is intractable, but simulations can be realized via forward modeling. We construct neural estimators for the likelihood(-ratio) or posterior and show that explicitly accounting for the model's hierarchical structure can lead to tighter parameter constraints. We illustrate our methods using case studies from particle physics and astrophysics.
Based on: https://arxiv.org/abs/2306.12584
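
The sketch below is only an illustration of the general workflow the abstract describes (simulation-based inference with per-event neural ratio estimation combined into a dataset-wide constraint), not the implementation from the paper. The toy hierarchical model (a global shift `mu`, per-event local nuisances `z_i`), the network architecture, and all variable names are assumptions chosen for the example; in this simple setup the local parameters are implicitly marginalized by the simulator.

```python
# Minimal sketch of per-event neural ratio estimation for a hierarchical model.
# Toy setup (illustrative, not from the paper): a global parameter mu shifts the
# mean of every event, and each event i has a local nuisance z_i ~ N(0, 1).
# Observations: x_i ~ N(mu + z_i, 0.5). A classifier trained to separate joint
# pairs (mu, x) from shuffled pairs (mu, x') learns the per-event log-ratio
# log p(x | mu) / p(x); summing the log-ratios over all events yields a
# dataset-wide log-likelihood-ratio curve for the global parameter mu.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

def simulate(mu, n):
    """Forward model: draw n events given the global parameter mu."""
    z = rng.normal(0.0, 1.0, size=n)       # local (per-event) parameters
    return rng.normal(mu + z, 0.5)          # observations

# Training data: sample mu from a proposal, simulate one event per draw.
n_train = 20000
mu_train = rng.uniform(-3.0, 3.0, size=n_train)
x_train = simulate(mu_train, n_train)

# Joint pairs (label 1) keep (mu, x) together; marginal pairs (label 0)
# shuffle x across draws, breaking the dependence on mu.
perm = rng.permutation(n_train)
inputs = np.concatenate([
    np.stack([mu_train, x_train], axis=1),
    np.stack([mu_train, x_train[perm]], axis=1),
])
labels = np.concatenate([np.ones(n_train), np.zeros(n_train)])

net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
X = torch.tensor(inputs, dtype=torch.float32)
y = torch.tensor(labels, dtype=torch.float32).unsqueeze(1)

for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.binary_cross_entropy_with_logits(net(X), y)
    loss.backward()
    opt.step()

# At the optimum the classifier logit approximates the per-event log-ratio.
# For an observed dataset, sum per-event log-ratios over a grid of mu values.
x_obs = simulate(mu=1.0, n=50)              # "observed" dataset, true mu = 1
mu_grid = np.linspace(-3, 3, 121)
with torch.no_grad():
    pairs = np.stack([np.repeat(mu_grid, len(x_obs)),
                      np.tile(x_obs, len(mu_grid))], axis=1)
    logr = net(torch.tensor(pairs, dtype=torch.float32)).numpy()
dataset_logr = logr.reshape(len(mu_grid), -1).sum(axis=1)
print("Maximum-likelihood estimate of mu:", mu_grid[np.argmax(dataset_logr)])
```

The dataset-wide curve can be used directly for frequentist confidence intervals, or added to a log-prior to obtain an (unnormalized) posterior over the global parameter; the network and simulation budget here are deliberately small and meant only to show how per-event estimates are combined across a dataset.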