27 November 2023 to 1 December 2023
Europe/Paris timezone

Data Subsampling for Bayesian Neural Networks

1 Dec 2023, 09:45
25m
Architectures (Adversarial, Bayesian, ...)

Speaker

Eiji Kawasaki (CEA)

Description

The development of an effective Uncertainty Quantification method that computes the predictive distribution by marginalizing over Deep Neural Network parameter sets remains an important and challenging task. In this context, Markov Chain Monte Carlo algorithms do not scale well to large datasets, which makes Neural Network posterior sampling difficult. During this talk, we'll show that a generalization of the Metropolis-Hastings algorithm makes it possible to restrict the evaluation of the likelihood to small mini-batches in a Bayesian inference context. Since it requires the computation of a so-called “noise penalty”, determined by the variance of the training loss function over the mini-batches, we refer to this data subsampling strategy as Penalty Bayesian Neural Networks (PBNNs).
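To illustrate the idea of a noise-penalized acceptance rule, here is a minimal sketch on a toy 1-D Gaussian linear model with a flat prior (not a neural network, and not the authors' implementation). The log acceptance ratio is estimated from a mini-batch and corrected by half the estimated variance of that estimator, which is the standard penalty-method correction for Gaussian estimator noise; all names, the dataset, and the hyperparameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy dataset: y = 2*x + Gaussian noise (sigma = 0.5).
N = 200
x = rng.normal(size=N)
y = 2.0 * x + rng.normal(scale=0.5, size=N)


def minibatch_log_like_diff(w_new, w_old, batch_idx):
    """Per-example estimates of the full-data log-likelihood difference,
    rescaled by N so their mini-batch mean is an unbiased estimate."""
    xb, yb = x[batch_idx], y[batch_idx]
    ll_new = -0.5 * ((yb - w_new * xb) / 0.5) ** 2
    ll_old = -0.5 * ((yb - w_old * xb) / 0.5) ** 2
    return N * (ll_new - ll_old)


def penalty_mh_step(w, batch_size=100, step=0.05):
    """One Metropolis-Hastings step where the log acceptance ratio is
    estimated on a mini-batch and corrected by the noise penalty."""
    w_prop = w + step * rng.normal()
    idx = rng.choice(N, size=batch_size, replace=False)
    diffs = minibatch_log_like_diff(w_prop, w, idx)
    delta_hat = diffs.mean()                   # noisy log-ratio estimate
    var_hat = diffs.var(ddof=1) / batch_size   # variance of that estimate
    # Noise penalty: subtracting var/2 keeps the acceptance rule
    # (approximately) unbiased when the estimator noise is Gaussian.
    log_accept = delta_hat - 0.5 * var_hat
    if np.log(rng.uniform()) < log_accept:
        return w_prop
    return w


# Short chain; the posterior should concentrate near the true slope w = 2.
w = 0.0
samples = []
for t in range(3000):
    w = penalty_mh_step(w)
    if t > 500:
        samples.append(w)
print(np.mean(samples))
```

Note that each mini-batch never sees the full likelihood: the variance term `var_hat` quantifies how noisy the subsampled estimate is, and the penalty trades acceptance rate for correctness, which is the trade-off the abstract refers to.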

Primary author

Eiji Kawasaki (CEA)

Co-author

Prof. Markus Holzmann (Univ. Grenoble Alpes, CNRS, LPMMC, 38000 Grenoble, France)

Presentation materials