Speaker
Theo Heimel
(Heidelberg)
Description
Theory predictions for the LHC require precise numerical phase-space integration and the generation of unweighted events. We combine machine-learned multi-channel weights with a normalizing flow for importance sampling to improve classical methods for numerical integration. By integrating buffered training for potentially expensive integrands, VEGAS initialization, symmetry-aware channels, and stratified training, we improve both efficiency and accuracy. Further, we show how differentiable programming techniques can boost the performance of current and planned MadGraph implementations. We validate these enhancements through tests on a diverse set of LHC processes.
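
The sketch below illustrates the multi-channel importance-sampling structure the abstract refers to: the integral is split into channels, each sampled from its own proposal density, and per-point channel weights alpha_i(x) with sum_i alpha_i(x) = 1 recombine the contributions. This is a minimal NumPy toy, not the MadNIS or MadGraph implementation: the analytic Gaussian proposals stand in for normalizing flows (which in practice would be initialized, e.g., from a VEGAS grid), and the fixed ansatz for alpha_i(x) stands in for a learned network; all names and parameter values are illustrative assumptions.

```python
# Toy multi-channel importance sampling: two-peak integrand, one
# proposal density per peak, and normalized channel weights alpha_i(x)
# recombining the channels. Illustrative stand-in, not MadNIS code.
import numpy as np

rng = np.random.default_rng(0)

def integrand(x):
    # two narrow peaks over the real line, a stand-in for a
    # multi-resonance matrix element
    return (np.exp(-0.5 * ((x - 0.25) / 0.02) ** 2)
            + np.exp(-0.5 * ((x - 0.75) / 0.02) ** 2))

# channel proposal densities; in the method described in the talk these
# would be normalizing flows rather than fixed Gaussians
channels = [
    {"loc": 0.25, "scale": 0.05},
    {"loc": 0.75, "scale": 0.05},
]

def sample(ch, n):
    return rng.normal(ch["loc"], ch["scale"], size=n)

def density(ch, x):
    z = (x - ch["loc"]) / ch["scale"]
    return np.exp(-0.5 * z**2) / (ch["scale"] * np.sqrt(2 * np.pi))

def alpha(x):
    # normalized channel weights alpha_i(x), sum_i alpha_i(x) = 1;
    # a machine-learned network would replace this fixed ansatz
    w = np.stack([density(ch, x) for ch in channels])
    return w / w.sum(axis=0, keepdims=True)

# I = sum_i E_{x ~ g_i}[ alpha_i(x) f(x) / g_i(x) ]
n_per_channel = 100_000
estimate = 0.0
for i, ch in enumerate(channels):
    x = sample(ch, n_per_channel)
    weights = alpha(x)[i] * integrand(x) / density(ch, x)  # per-event weights
    estimate += weights.mean()

exact = 2 * 0.02 * np.sqrt(2 * np.pi)  # analytic value of the toy integral
print(f"multi-channel estimate: {estimate:.5f}  (exact: {exact:.5f})")
```

In the full method, both the flow parameters and the channel weights are trained (with buffered samples to limit calls to expensive integrands), whereas this toy keeps everything fixed to expose only the estimator structure.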