6–11 Jul 2025
PALAIS DU PHARO, Marseille, France
Europe/Paris timezone

Low-latency Jet Tagging for HL-LHC Using Transformer Architectures

Not scheduled
20m
PALAIS DU PHARO, Marseille, France

Parallel T16 - AI for HEP (special topic 2025) Joint T12+T16

Speaker

Lauri Laatu (Imperial College London)

Description

Transformers are state-of-the-art model architectures, widely used across machine learning application areas. However, the performance of such architectures is less well explored in ultra-low-latency domains where deployment on FPGAs or ASICs is required, such as the trigger and data acquisition systems of the LHC experiments.

We present a transformer-based jet-tagging algorithm built with the HGQ2 framework, which produces models with heterogeneous bitwidths for fast inference on FPGAs, as required by the trigger systems of the LHC experiments. The bitwidths are learned during training by minimizing the total number of bit operations as an additional term in the loss. By allowing a bitwidth of zero, the model is pruned in situ during training. Using this quantization-aware approach, our algorithm achieves state-of-the-art performance while retaining permutation invariance, a key property for particle physics applications.
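
To make the quantization scheme concrete, below is a minimal conceptual sketch in PyTorch of quantization-aware training with a learnable bitwidth and a bit-operations (BOPs) penalty. This is an illustration under stated assumptions, not the HGQ2 implementation: the names LearnedBitwidthQuantizer, QuantLinear, and bops_weight are hypothetical, and HGQ2 itself is a Keras-based framework with its own layers and gradient surrogates.

import torch
import torch.nn as nn


class LearnedBitwidthQuantizer(nn.Module):
    # Fake-quantizer with a learnable, continuous bitwidth. A bitwidth
    # driven to zero gates the tensor off entirely, pruning it in situ.
    def __init__(self, init_bits: float = 8.0):
        super().__init__()
        self.bits = nn.Parameter(torch.tensor(init_bits))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b = torch.relu(self.bits)           # effective bitwidth, >= 0
        levels = torch.pow(2.0, b)          # number of quantization levels
        scale = x.detach().abs().max().clamp(min=1e-8)
        u = x / scale * levels
        # Straight-through estimator: rounding is invisible to backward,
        # while dividing by `levels` afterwards lets the task gradient
        # reach the bitwidth through the quantization error.
        u_q = u + (torch.round(u) - u).detach()
        x_q = u_q / levels * scale
        return x_q * (b > 0).float()        # zero bits => pruned tensor


class QuantLinear(nn.Module):
    # Linear layer whose weights pass through the learned-bitwidth quantizer.
    def __init__(self, n_in: int, n_out: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(n_out, n_in) * 0.1)
        self.quant = LearnedBitwidthQuantizer()
        self.n_in, self.n_out = n_in, n_out

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.quant(self.weight).t()

    def bops(self, act_bits: float = 8.0) -> torch.Tensor:
        # Dense-layer BOPs scale with (weight bits x activation bits) per
        # multiply-accumulate; this is the quantity added to the loss.
        return self.n_in * self.n_out * torch.relu(self.quant.bits) * act_bits


# One training step: task loss plus a BOPs penalty, so the optimizer
# trades accuracy against hardware cost and can push bitwidths to zero.
layer = QuantLinear(16, 4)
opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
x, y = torch.randn(32, 16), torch.randn(32, 4)
bops_weight = 1e-6                          # illustrative trade-off coefficient
opt.zero_grad()
loss = nn.functional.mse_loss(layer(x), y) + bops_weight * layer.bops()
loss.backward()
opt.step()

Because the BOPs term and the task loss are optimized jointly, each tensor settles on its own bitwidth, which is how the heterogeneous bitwidths and in-situ pruning described above can emerge from a single objective.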

Owing to the strength of transformers in representation learning, our work also serves as a stepping stone toward a larger foundation model for trigger applications.

Authors

Abhijith Gandrakota (Fermilab), Alexander Tapper (Imperial College London), Arianna Cox (Imperial College London), Benedikt Maier (Imperial College London), Chang Sun (Caltech), Filip Wojcicki (Imperial College London), Jennifer Ngadiuba (Fermilab), Lauri Laatu (Imperial College London), Zhiqiang Que (Imperial College London)

Presentation materials

There are no materials yet.