Session chairs
Controlling uncertainties in generative models
- Thomas Vuillaume (LAPP, Univ. Savoie Mont-Blanc, CNRS)
- Tommaso Dorigo (INFN Sezione di Padova)
- Mehmet Ozgur Sahin (CEA-Saclay)
Recent progress in computer science, specifically the development of normalising flows and diffusion models, has brought about a breakthrough in the fidelity of generative models in particle physics.
In this talk I will first review some of these new approaches and then discuss potential uses, considering the overall theme of uncertainties. This will allow us to examine statistical properties,...
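As a concrete, hedged illustration of the kind of model reviewed here (not taken from the talk itself), a RealNVP-style affine coupling layer with an exact log-likelihood might be sketched in PyTorch as follows; the class and variable names are purely illustrative.

# Minimal sketch of a normalising-flow building block (illustrative only).
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Transforms half of the features conditioned on the other half."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        # small network predicting scale and shift for the second half
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                       # keep scales numerically stable
        y2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=-1)                 # Jacobian log-determinant
        return torch.cat([x1, y2], dim=-1), log_det

def nll(flow, x):
    # exact negative log-likelihood under a standard-normal base distribution,
    # the quantity that makes flows attractive for statistical studies
    z, log_det = flow(x)
    base = torch.distributions.Normal(0.0, 1.0)
    return -(base.log_prob(z).sum(dim=-1) + log_det).mean()

In practice several such coupling layers are stacked, but the exact likelihood above is already the hook for the statistical questions the talk raises.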
Our ability to collect data is rapidly increasing thanks to computational power and the unprecedented diversity of sensors. But how good are we at extracting, reconstructing, and understanding the information they carry? We present a short overview of some recent advances in data assimilation and modelling of turbulent multi-scale flows using both data-driven and equation-informed tools,...
In recent years, generative modeling has gained substantial momentum in genomics research, thanks to the increased availability of computational resources and the development of deep generative models (DGMs) over the past decade. DGMs can learn the complex structure of genomic data and can be used for a variety of tasks, such as the generation of realistic artificial genomes, dimensionality reduction...
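To make the kind of DGM meant here concrete, the sketch below shows a minimal variational autoencoder for binary genotype vectors (SNPs coded 0/1); the architecture, data format, and names are assumptions for illustration, not details from the talk.

# Minimal VAE sketch for genotype data (illustrative assumptions throughout).
import torch
import torch.nn as nn

class GenotypeVAE(nn.Module):
    def __init__(self, n_snps, latent=16):
        super().__init__()
        self.enc = nn.Linear(n_snps, 2 * latent)    # predicts mean and log-variance
        self.dec = nn.Linear(latent, n_snps)

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterisation
        recon = torch.sigmoid(self.dec(z))          # per-SNP probabilities
        return recon, mu, logvar

def elbo_loss(recon, x, mu, logvar):
    # reconstruction term plus KL regulariser; the latent means double as a
    # learned low-dimensional embedding (the dimensionality-reduction use case)
    bce = nn.functional.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

Sampling z from the prior and decoding gives artificial genotype vectors, the other application named in the abstract.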
Given the recent success of diffusion models in image generation, we study their applicability to generating LHC phase-space distributions. We find that they achieve percent-level precision comparable to INNs. Training uncertainties are quantified by developing Bayesian versions, which further enhance the interpretability of our results. In this talk, diffusion models are introduced and discussed...
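For orientation, a DDPM-style training step on low-dimensional phase-space vectors might look as follows; the ensemble at the end is a simple stand-in for the Bayesian networks mentioned above, and every name and hyperparameter here is an assumption, not the authors' setup.

# Illustrative DDPM noise-prediction loss for event vectors (sketch only).
import torch
import torch.nn as nn

T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

def make_denoiser(dim):
    # small MLP predicting the added noise from (noisy sample, timestep)
    return nn.Sequential(nn.Linear(dim + 1, 128), nn.SiLU(),
                         nn.Linear(128, 128), nn.SiLU(),
                         nn.Linear(128, dim))

def ddpm_loss(model, x0):
    t = torch.randint(0, T, (x0.shape[0],))
    eps = torch.randn_like(x0)
    a = alphas_bar[t].unsqueeze(-1)
    xt = a.sqrt() * x0 + (1 - a).sqrt() * eps           # forward diffusion
    t_feat = (t.float() / T).unsqueeze(-1)
    pred = model(torch.cat([xt, t_feat], dim=-1))        # predict the noise
    return ((pred - eps) ** 2).mean()

# The spread over independently trained denoisers gives one practical proxy
# for the training uncertainty on generated densities.
ensemble = [make_denoiser(dim=4) for _ in range(5)]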
Data-driven techniques are indispensable for addressing the limitations of Monte Carlo (MC) simulations in High Energy Physics experiments, such as insufficient statistics and process mismodeling. Accurate representation of background processes is essential for achieving optimal measurement sensitivity. Traditional approaches often involve the selection of a control region to model the...
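One common instance of such a data-driven approach, not necessarily the one used in this work, is classifier-based reweighting: a discriminator trained to separate data from MC in the control region yields per-event weights approximating p_data/p_MC that can be transported to the signal region. The sketch below assumes scikit-learn and uses illustrative variable names.

# Hypothetical sketch of control-region classifier reweighting.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def control_region_weights(mc_cr, data_cr, mc_sr):
    """Return weights for signal-region MC derived in the control region."""
    X = np.vstack([mc_cr, data_cr])
    y = np.concatenate([np.zeros(len(mc_cr)), np.ones(len(data_cr))])
    clf = GradientBoostingClassifier().fit(X, y)
    p = clf.predict_proba(mc_sr)[:, 1]
    # likelihood-ratio trick: p / (1 - p) estimates the density ratio
    return p / np.clip(1.0 - p, 1e-6, None)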
Machine Learning improved the sensitivity in searches for massive long-lived neutral particles decaying in the calorimeter by over 30%. This was achieved only after suppressing a large increase in the systematic uncertainties caused by the method. The largest contribution to this improvement in sensitivity is the use of a Recurrent Neural Network that separates signal from standard QCD multijet background and...
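As a hedged sketch of what such a recurrent classifier can look like (the analysis' actual architecture, inputs, and training are not specified here), a small LSTM over variable-length sequences of jet constituents could be written in PyTorch as follows; all names are illustrative.

# Illustrative jet-level RNN classifier: signal vs. QCD multijet (sketch only).
import torch
import torch.nn as nn

class JetRNN(nn.Module):
    def __init__(self, n_features=6, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, constituents):            # (batch, n_constituents, n_features)
        _, (h, _) = self.rnn(constituents)      # final hidden state summarises the jet
        return torch.sigmoid(self.head(h[-1]))  # signal probability per jet

model = JetRNN()
loss_fn = nn.BCELoss()   # trained against signal/background labels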