Session chairs
Architectures
- Gregor Kasieczka (Hamburg University)
- David Rousseau (IJCLab, CNRS/IN2P3, Université Paris-Saclay)
In this presentation I will discuss recent trends in the handling of systematic uncertainties in HEP analysis tasks, and techniques proposed to mitigate or remove their effect in the search for optimal selection criteria and variable transformations.
The approaches discussed include nuisance-parametrized models, modified adversarial losses, semi-supervised learning, inference-aware...
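To make the first of these approaches concrete, here is a minimal, hypothetical sketch of a nuisance-parametrized classifier (my own illustration, not code from the talk): the value of the nuisance parameter is appended to the event features and the model is trained over nuisance values drawn from a prior, so the learned decision boundary can adapt to whatever value is assumed at inference time. The toy data and all names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_events(n, nu):
    """Toy events: the background mean shifts with the nuisance parameter nu."""
    y = rng.integers(0, 2, n)                        # 1 = signal, 0 = background
    x = rng.normal(loc=np.where(y == 1, 1.0, nu), scale=1.0, size=n)
    return x.reshape(-1, 1), y

# Training set: draw a nuisance value per event from an assumed prior nu ~ N(0, 0.5)
# and hand it to the classifier as an additional input feature.
n_train = 20000
nu_train = rng.normal(0.0, 0.5, n_train)
x_train, y_train = toy_events(n_train, nu_train)
features = np.hstack([x_train, nu_train.reshape(-1, 1)])

# Minimal logistic regression trained by gradient descent (stand-in for a DNN).
w = np.zeros(features.shape[1])
b = 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(features @ w + b)))    # sigmoid output
    grad = p - y_train
    w -= 0.1 * features.T @ grad / n_train
    b -= 0.1 * grad.mean()

def score(x, nu):
    """Classifier score for events x under an assumed nuisance value nu."""
    z = np.hstack([x.reshape(-1, 1), np.full((len(x), 1), nu)])
    return 1.0 / (1.0 + np.exp(-(z @ w + b)))

# The same event receives different scores under different nuisance hypotheses.
x_test = np.array([0.5])
print(score(x_test, nu=-0.5), score(x_test, nu=+0.5))
```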
The development of an effective Uncertainty Quantification method that computes the predictive distribution by marginalizing over Deep Neural Network parameter sets remains an important, challenging task. In this context, Markov Chain Monte Carlo algorithms do not scale well for large datasets, leading to difficulties in Neural Network posterior sampling. During this talk, we'll show that a...
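As a generic illustration of the marginalization step (not code from the talk), the sketch below computes the predictive distribution of a toy network by averaging its predictions over a set of weight samples; in practice, producing such samples from the true posterior is exactly what MCMC struggles to do at scale.

```python
import numpy as np

def net(x, w):
    """Tiny one-hidden-layer regression network with its 25 weights packed in w."""
    w1, b1, w2, b2 = w[:8], w[8:16], w[16:24], w[24]
    return np.tanh(x[:, None] * w1 + b1) @ w2 + b2

# Stand-in for weight samples drawn from the posterior p(w | data), e.g. by MCMC.
posterior_samples = [np.random.default_rng(s).normal(0, 1, 25) for s in range(200)]

x = np.linspace(-3, 3, 50)
preds = np.stack([net(x, w) for w in posterior_samples])   # (n_samples, n_points)

# Predictive distribution: marginalize over the sampled weight configurations.
pred_mean = preds.mean(axis=0)     # predictive mean
pred_std = preds.std(axis=0)       # spread induced by weight (epistemic) uncertainty
```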
Bayesian neural networks are a key technique for including uncertainty predictions in neural network analyses, be it in classification, regression or generation. Although they are an essential building block of classical Bayesian techniques, Markov Chain Monte Carlo methods are seldom used to sample Bayesian neural network weight posteriors due to slow convergence rates in high dimensional...
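A minimal random-walk Metropolis sampler over the weights of a toy network illustrates the convergence issue mentioned above: each step requires a full pass over the data, and only small proposal steps are accepted as the number of weights grows. This is a generic sketch under assumed toy data, not the method proposed in the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data and a tiny network with 25 weights.
x_data = np.linspace(-2, 2, 100)
y_data = np.sin(x_data) + rng.normal(0, 0.1, 100)

def net(x, w):
    w1, b1, w2, b2 = w[:8], w[8:16], w[16:24], w[24]
    return np.tanh(x[:, None] * w1 + b1) @ w2 + b2

def log_post(w):
    """Unnormalized log posterior: Gaussian likelihood plus standard normal prior."""
    resid = y_data - net(x_data, w)
    return -0.5 * np.sum(resid**2) / 0.1**2 - 0.5 * np.sum(w**2)

# Random-walk Metropolis over the full weight vector.
w = rng.normal(0, 0.1, 25)
lp = log_post(w)
samples, accepted = [], 0
for _ in range(5000):
    w_prop = w + rng.normal(0, 0.02, 25)             # small steps needed in high dimension
    lp_prop = log_post(w_prop)
    if np.log(rng.uniform()) < lp_prop - lp:         # Metropolis accept/reject
        w, lp, accepted = w_prop, lp_prop, accepted + 1
    samples.append(w.copy())

print("acceptance rate:", accepted / 5000)
```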
Predictions from empirical evidence come with many sources of potential uncertainty and error. First, the specific choices of models and concepts that we attach to the observations act as a strong prism on the resulting conclusions. Uncertainty about which functional form to use in a model naturally results in uncertainty in the conclusions. Outside of mature (post-paradigmatic) quantitative sciences...