Within the framework of Local Analytic Sector Subtraction, we briefly present the method for removing infrared singularities at Next-to-Leading Order (NLO) in QCD for processes involving massless coloured particles in either the initial or the final state. We then present an extension of Local Analytic Sector Subtraction to the case of a massive emitter. This process also allows us to...
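As a schematic reminder of the general structure behind such subtraction schemes (not the specific formulas of this contribution), the NLO cross section can be reorganised as

\[
\sigma_{\mathrm{NLO}} \;=\; \int_{n+1} \big[\, d\sigma_{\mathrm{R}} - d\sigma_{\mathrm{CT}} \,\big] \;+\; \int_{n} \Big[\, d\sigma_{\mathrm{V}} + \int_{1} d\sigma_{\mathrm{CT}} \,\Big],
\]

where the local counterterm \(d\sigma_{\mathrm{CT}}\) reproduces the infrared limits of the real-emission contribution \(d\sigma_{\mathrm{R}}\), so that both integrals are separately finite in four dimensions; in a sector-subtraction approach the counterterm is partitioned by sector functions and built to be analytically integrable over the radiation phase space.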
I present a probabilistically founded definition of theory uncertainties in perturbative computations due to unknown higher orders. I show its performance against canonical recipes such as scale variation. Finally, I discuss future directions.
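For reference, here is a minimal sketch of the canonical 7-point scale-variation recipe against which such a definition can be benchmarked; the toy prediction sigma(mur, muf) and its numbers are assumptions for illustration only.

```python
# Minimal sketch of the canonical 7-point scale-variation estimate of
# missing-higher-order uncertainties (illustrative only).
import math

def sigma(mur, muf):
    # placeholder prediction with a mild logarithmic scale dependence (assumption)
    return 100.0 * (1.0 + 0.1 * math.log(mur) + 0.05 * math.log(muf))

central = sigma(1.0, 1.0)

# 7-point prescription: vary muR and muF by factors of 2,
# dropping the two extreme combinations (1/2, 2) and (2, 1/2)
variations = [(r, f)
              for r in (0.5, 1.0, 2.0)
              for f in (0.5, 1.0, 2.0)
              if (r, f) not in [(0.5, 2.0), (2.0, 0.5)]]

predictions = [sigma(r, f) for r, f in variations]
delta_plus = max(predictions) - central
delta_minus = central - min(predictions)
print(f"sigma = {central:.2f} +{delta_plus:.2f} -{delta_minus:.2f} (7-point envelope)")
```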
The need for percent-level precision in high-energy physics requires the inclusion of QED effects in theoretical predictions, such as the contributions coming from photon-initiated processes. It is therefore crucial to correctly determine the photon content of the proton.
In this work, we extend the NNPDF4.0 NNLO determination of parton distribution functions (PDFs) with a photon PDF,...
We include uncertainties due to missing higher-order corrections to the QCD computations (MHOU) used in the determination of parton distributions (PDFs) in the recent NNPDF4.0 set of PDFs. We use our previously published methodology, based on the treatment of MHOUs and their full correlations through a theory covariance matrix determined by scale variation, now fully incorporated in the new NNPDF...
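A minimal sketch of how a theory covariance matrix can be assembled from scale-varied predictions, in the spirit of the methodology above; the dummy predictions, the set of variations and the overall normalisation are assumptions here, since in practice they depend on the chosen point prescription.

```python
# Sketch of a theory covariance matrix built from scale-varied predictions
# (dummy numbers; normalisation and variation set are prescription-dependent).
import numpy as np

central = np.array([1.00, 2.00, 1.50, 0.80])      # central-scale theory for 4 data points
varied = np.array([                                # predictions at a few scale variations
    [1.05, 2.08, 1.56, 0.83],
    [0.96, 1.93, 1.45, 0.77],
    [1.02, 2.03, 1.52, 0.81],
    [0.98, 1.97, 1.48, 0.79],
])

shifts = varied - central            # Delta_i(V) = T_i(V) - T_i(central)
norm = 1.0 / shifts.shape[0]         # schematic normalisation (assumption)
S = norm * shifts.T @ shifts         # S_ij = n * sum_V Delta_i(V) Delta_j(V)

# In a PDF fit, S would then be added to the experimental covariance matrix:
# C_total = C_exp + S
print(S)
```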
We investigate the impact of theory uncertainties on a global EDM analysis in the low-energy sector, employing SFitter as our tool of choice. In contrast to previous analyses, theory uncertainties in the EDM sector depend heavily on the model parameters and thus cannot be disentangled from the prediction as readily as in global SMEFT analyses.
Off-shell effects in large LHC backgrounds are crucial for precision predictions and, at the same time, challenging to simulate. We show how a generative diffusion network learns off-shell kinematics given the much simpler on-shell process. It generates off-shell configurations quickly and precisely, while reproducing even challenging off-shell features.
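As an illustration of the underlying idea, here is a minimal sketch of a conditional denoising-diffusion training loop in PyTorch, where a network learns to denoise off-shell features conditioned on the on-shell ones; all dimensions, schedules, architectures and the toy data are assumptions, not the setup of this contribution.

```python
# Minimal conditional denoising-diffusion sketch: the network predicts the
# noise added to off-shell features x, conditioned on on-shell features c.
import torch
import torch.nn as nn

T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)       # cumulative noise schedule

dim_x, dim_c = 8, 8                                  # feature dimensions (assumption)

eps_net = nn.Sequential(                             # noise-prediction network
    nn.Linear(dim_x + dim_c + 1, 128), nn.SiLU(),
    nn.Linear(128, 128), nn.SiLU(),
    nn.Linear(128, dim_x),
)
opt = torch.optim.Adam(eps_net.parameters(), lr=1e-3)

for step in range(100):                              # toy training loop
    x0 = torch.randn(256, dim_x)                     # stand-in for off-shell events
    c = torch.randn(256, dim_c)                      # stand-in for on-shell condition
    t = torch.randint(0, T, (256,))
    eps = torch.randn_like(x0)
    a_bar = alphas_bar[t].unsqueeze(-1)
    xt = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps   # forward noising step
    t_feat = (t.float() / T).unsqueeze(-1)
    pred = eps_net(torch.cat([xt, c, t_feat], dim=-1))
    loss = ((pred - eps) ** 2).mean()                # standard noise-matching loss
    opt.zero_grad(); loss.backward(); opt.step()
```

Sampling (reversing the noising process to generate off-shell configurations from on-shell conditions) is omitted here for brevity.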
Unfolding is a key technique for analysing LHC data. More recently, modern machine-learning tools have enabled its implementation in an unbinned and high-dimensional manner. The basic techniques to perform unfolding include event reweighting, direct mapping between distributions, and conditional phase-space sampling, each providing a way to unfold LHC data while accounting for all...
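To make the first of these concrete, here is a minimal sketch of classifier-based event reweighting, where likelihood-ratio weights w(x) ≈ f(x)/(1 − f(x)) pull the simulation towards the data; the toy Gaussian samples and the choice of classifier are assumptions for illustration.

```python
# Sketch of the event-reweighting idea behind ML-based unfolding: a classifier
# trained to separate data from simulation yields per-event weights
# w(x) = p_data(x)/p_sim(x) ~ f(x)/(1 - f(x)).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
sim = rng.normal(0.0, 1.0, size=(20_000, 1))     # detector-level simulation (toy)
data = rng.normal(0.3, 1.1, size=(20_000, 1))    # "measured" data (toy)

X = np.vstack([sim, data])
y = np.concatenate([np.zeros(len(sim)), np.ones(len(data))])

clf = GradientBoostingClassifier().fit(X, y)
f = clf.predict_proba(sim)[:, 1]
weights = f / (1.0 - f)                          # per-event reweighting factors

# pulled back to particle level, such weights reweight the generator events
# so that the simulation matches the data in an unbinned way
print(weights.mean(), weights.std())
```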
Deep generative models have emerged as a powerful paradigm for enhancing and maximising the discovery potential at collider experiments. They can be deployed for multiple tasks, including fast simulations, data augmentation and anomaly detection. As novel methods continue to be developed, there is a pressing need to advance techniques for model selection and evaluation, particularly in...
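One widely used evaluation tool in this context is the classifier two-sample test, sketched below under toy assumptions: if a classifier cannot separate generated from reference events (AUC close to 0.5), the two samples agree in the features it sees. The Gaussian toy samples and classifier choice are assumptions.

```python
# Classifier two-sample test for evaluating a generative model (toy example).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
reference = rng.normal(0.0, 1.0, size=(20_000, 4))
generated = rng.normal(0.05, 1.0, size=(20_000, 4))   # slightly mismodelled sample

X = np.vstack([reference, generated])
y = np.concatenate([np.zeros(len(reference)), np.ones(len(generated))])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = HistGradientBoostingClassifier().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"two-sample test AUC = {auc:.3f}  (0.5 means indistinguishable)")
```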
Nowadays, research into Beyond the Standard Model (BSM) scenarios aimed at describing the nature of dark matter is a very active field. DarkPACK is a recently released software package designed to help study such models. It can already compute the relic density in the freeze-out scenario, and its framework can be used to compute other observables.
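For orientation only, here is a generic, textbook-level sketch of a freeze-out relic-density calculation (this is not DarkPACK code and does not use its API); the mass, degrees of freedom, constant ⟨σv⟩ and the approximate conversion factors are illustrative assumptions.

```python
# Generic freeze-out sketch: integrate the Boltzmann equation for the comoving
# yield Y = n/s as a function of x = m/T, then convert Y(infinity) to Omega h^2.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import kn

m = 100.0          # dark-matter mass in GeV (assumption)
g = 2.0            # internal degrees of freedom (assumption)
g_star = 100.0     # effective relativistic d.o.f., taken constant (assumption)
M_pl = 1.22e19     # Planck mass in GeV
sigma_v = 3e-9     # <sigma v> in GeV^-2, roughly the "thermal" value (assumption)

# dY/dx = -(lam / x^2) (Y^2 - Y_eq^2), with constant prefactor lam
lam = np.sqrt(np.pi / 45.0) * np.sqrt(g_star) * M_pl * m * sigma_v

def Y_eq(x):
    # equilibrium yield of a non-relativistic species
    return 45.0 / (4.0 * np.pi**4) * (g / g_star) * x**2 * kn(2, x)

def rhs(x, Y):
    return -(lam / x**2) * (Y[0]**2 - Y_eq(x)**2)

sol = solve_ivp(rhs, (1.0, 1000.0), [Y_eq(1.0)],
                method="LSODA", rtol=1e-8, atol=1e-30)
Y_inf = sol.y[0, -1]
omega_h2 = 2.76e8 * m * Y_inf        # approximate conversion to Omega h^2
print(f"Y_infinity ~ {Y_inf:.3e},  Omega h^2 ~ {omega_h2:.3f}")
```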