Speaker
David Yallup
(University of Cambridge)
Description
The frontier of likelihood-free inference typically has density estimation with Neural Networks at its core. The resulting surrogate model used for inference faces the well-established challenges of capturing the modelling and parameter uncertainties of the network. In this contribution I will review progress made in building Neural Networks trained with Nested Sampling, which represents a novel form of Bayesian Neural Network. This new paradigm can uniquely capture the modelling uncertainty and provides a new perspective on the fundamental structure of Neural Networks.
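As a rough illustration of the underlying idea (not the authors' implementation), the sketch below samples the weight posterior of a toy one-hidden-layer network with an off-the-shelf nested sampler (dynesty), which also yields the evidence used for model comparison. The architecture, prior, data, and noise level are illustrative assumptions only.

```python
# Minimal sketch: nested sampling over the weights of a tiny neural network.
# Assumed setup: synthetic 1D regression data, uniform weight priors, fixed noise.
import numpy as np
import dynesty

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 32)[:, None]
y = np.sin(3 * x) + 0.1 * rng.normal(size=x.shape)

n_hidden = 4
ndim = 3 * n_hidden + 1  # input weights + hidden biases + output weights + output bias

def unpack(theta):
    w1 = theta[:n_hidden].reshape(1, n_hidden)
    b1 = theta[n_hidden:2 * n_hidden]
    w2 = theta[2 * n_hidden:3 * n_hidden].reshape(n_hidden, 1)
    b2 = theta[3 * n_hidden]
    return w1, b1, w2, b2

def predict(theta):
    w1, b1, w2, b2 = unpack(theta)
    return np.tanh(x @ w1 + b1) @ w2 + b2

def loglike(theta):
    # Gaussian likelihood with an assumed fixed noise scale of 0.1
    resid = y - predict(theta)
    return -0.5 * np.sum((resid / 0.1) ** 2)

def prior_transform(u):
    # Map the unit cube to a uniform prior on [-5, 5] for every weight (assumption)
    return 10.0 * u - 5.0

sampler = dynesty.NestedSampler(loglike, prior_transform, ndim, nlive=200)
sampler.run_nested(dlogz=0.5, print_progress=False)
res = sampler.results
print("log-evidence:", res.logz[-1], "+/-", res.logzerr[-1])
```

The weight samples in `res.samples` (with weights from `res.logwt`) give a posterior over network functions, so predictive and modelling uncertainty follow by averaging predictions over them.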
Primary author
David Yallup
(University of Cambridge)
Co-author
Will Handley
(University of Cambridge)