Experimental verification of the Higgs trilinear self-coupling is one of the next major challenges of particle physics. While prospects from proton-proton collisions have centred on measuring the on-shell single- and di-Higgs production processes, the off-shell Higgs production process has also been suggested as a complementary channel to resolve the degeneracy in Higgs couplings. We...
Measurements and observations in particle physics fundamentally depend on the ability to quantify their uncertainty and, thereby, their significance. Therefore, as Machine Learning methods become more prevalent in HEP, being able to determine the uncertainties of an ML method becomes more important. A wide range of possible approaches has been proposed; however, there has not been a...
In this talk, we present a dedicated graph neural network (GNN)-based methodology for the extraction of the Higgs boson signal strength, incorporating systematic uncertainties. The model features two branches: a deterministic GNN that processes kinematic variables unaffected by nuisance parameters, and an uncertainty-aware GNN that handles inputs modulated by systematic effects through gated...
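A minimal PyTorch sketch of the two-branch idea described above, with simple MLP encoders standing in for the two GNN branches: one branch sees only nuisance-free kinematic inputs, the other sees systematics-modulated inputs passed through a learned gate. All layer shapes, the gating choice, and the class name TwoBranchModel are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class TwoBranchModel(nn.Module):
    def __init__(self, n_clean, n_syst, hidden=64):
        super().__init__()
        # deterministic branch: kinematic variables unaffected by nuisance parameters
        self.clean_branch = nn.Sequential(
            nn.Linear(n_clean, hidden), nn.ReLU(), nn.Linear(hidden, hidden), nn.ReLU()
        )
        # uncertainty-aware branch: inputs modulated by systematic effects
        self.syst_branch = nn.Sequential(
            nn.Linear(n_syst, hidden), nn.ReLU(), nn.Linear(hidden, hidden), nn.ReLU()
        )
        # gate that admits or suppresses the systematics-dependent features
        self.gate = nn.Sequential(nn.Linear(n_syst, hidden), nn.Sigmoid())
        self.head = nn.Linear(2 * hidden, 1)  # per-event score used in the fit

    def forward(self, x_clean, x_syst):
        h_clean = self.clean_branch(x_clean)
        h_syst = self.syst_branch(x_syst) * self.gate(x_syst)
        return self.head(torch.cat([h_clean, h_syst], dim=-1))

model = TwoBranchModel(n_clean=8, n_syst=4)
score = model(torch.randn(32, 8), torch.randn(32, 4))  # shape (32, 1)
```

In a full analysis the two encoders would be graph neural networks over per-event particle graphs; the MLPs here only serve to show how the nuisance-free and systematics-dependent information paths are kept separate before being combined.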
With the discovery of the Standard Model (SM) Higgs boson (H) by the CMS and ATLAS experiments in 2012, the last missing elementary particle predicted by the SM was identified. Since then, extensive measurements of the Higgs boson’s properties have been performed across various decay channels. One of the most important is its decay into a pair of tau leptons, the second-most frequent fermionic...
Neural Simulation-Based Inference (NSBI) is a powerful class of machine learning (ML)-based methods for statistical inference that naturally handles high-dimensional parameter estimation without the need to bin data into low-dimensional summary histograms. Such methods are promising for a range of measurements at the Large Hadron Collider, where no single observable may be optimal to scan over...
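A common building block of NSBI is the classifier-based likelihood-ratio trick: a classifier trained to separate samples generated under two hypotheses approximates r(x) = p(x|H1)/p(x|H0) via s(x)/(1 - s(x)). The toy sketch below, using 1D Gaussian data and an illustrative network size, shows the idea under those assumptions; it is not the specific pipeline of the talk.

```python
import torch
import torch.nn as nn

# toy samples: H0 ~ N(0, 1), H1 ~ N(0.5, 1)
x0 = torch.randn(10000, 1)
x1 = torch.randn(10000, 1) + 0.5
x = torch.cat([x0, x1])
y = torch.cat([torch.zeros(10000, 1), torch.ones(10000, 1)])

clf = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(clf(x), y)   # binary cross-entropy on H0-vs-H1 labels
    loss.backward()
    opt.step()

# density-ratio estimate; the exact ratio for this toy is exp(0.5*x - 0.125)
s = torch.sigmoid(clf(torch.tensor([[0.0], [1.0]])))
r_hat = s / (1 - s)
print(r_hat.squeeze().tolist())
```

The same construction extends to high-dimensional event-level inputs, which is what lets NSBI avoid binning into summary histograms.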
I present a new method for unbinned cross-section measurements and related inference problems at the LHC.
The new methodology revolves around 'refinable' machine learning of various model parameter dependencies, with a particular focus on systematic effects. It shows significant performance gains in concrete applications. I will illustrate the general methodology with two realistic cases: An...
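One generic way to make an ML model carry explicit model-parameter dependence is to pass the parameter (for example a nuisance parameter theta) as an additional network input, so the learned response can be re-evaluated, or refined by further training, at any parameter value. The sketch below illustrates that general idea only; the class name ParameterizedClassifier and all settings are hypothetical, and this is not the specific 'refinable' construction presented in the talk.

```python
import torch
import torch.nn as nn

class ParameterizedClassifier(nn.Module):
    def __init__(self, n_features, hidden=64):
        super().__init__()
        # the model parameter is concatenated to the event features
        self.net = nn.Sequential(
            nn.Linear(n_features + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, theta):
        # theta: nuisance/model parameter, broadcast per event
        return self.net(torch.cat([x, theta], dim=-1))

clf = ParameterizedClassifier(n_features=10)
x = torch.randn(128, 10)
theta = torch.full((128, 1), 0.3)   # evaluate the response at a chosen parameter value
logits = clf(x, theta)              # per-event scores at this theta
```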