Speaker
Huilin Qu (CERN)
Description
A rising paradigm in AI in recent years is the foundation model: a model trained on broad data that can be adapted to a wide range of downstream tasks. In this work, we present a new approach to learning powerful jet representations directly from unlabelled data. The method employs a Particle Transformer to predict masked particle representations in a latent space, removing the need for discrete tokenization and allowing the method to extend to arbitrary input features beyond the Lorentz four-vectors. We demonstrate the effectiveness and flexibility of this method on several downstream tasks, including jet tagging and anomaly detection. Our approach offers a possible new path towards a foundation model for jet physics.
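To make the idea concrete, below is a minimal PyTorch sketch of masked prediction in a latent space, in the spirit described above: a context encoder sees a jet with some particles replaced by a learnable mask token, a frozen target encoder produces latent targets from the full jet, and the loss regresses the latents at the masked positions. All names and hyperparameters (ParticleEncoder, mask_ratio, feature dimensions) are illustrative assumptions, and a plain TransformerEncoder stands in for the Particle Transformer; this is not the authors' actual implementation.

```python
# Illustrative sketch only: a generic latent masked-particle objective,
# not the authors' code. A standard TransformerEncoder stands in for
# the Particle Transformer.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class ParticleEncoder(nn.Module):
    """Embeds per-particle features and runs a Transformer over the jet."""

    def __init__(self, in_dim=7, dim=128, depth=4, heads=8):
        super().__init__()
        self.embed = nn.Linear(in_dim, dim)
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, batch_first=True, norm_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):
        return self.encoder(self.embed(x))


def masked_latent_loss(context_enc, target_enc, predictor, mask_token,
                       x, mask_ratio=0.4):
    """One training step of latent masked-particle prediction.

    x: (batch, n_particles, in_dim) continuous particle features; any
    per-particle inputs work, since no discrete tokenization is needed.
    """
    B, N, _ = x.shape
    mask = torch.rand(B, N, device=x.device) < mask_ratio  # True = masked

    # Latent targets come from a frozen copy of the encoder applied to
    # the full, unmasked jet (in practice its weights would track the
    # context encoder, e.g. via an exponential moving average).
    with torch.no_grad():
        target = target_enc(x)

    # The context encoder sees mask tokens in place of masked particles.
    h = context_enc.embed(x)
    h = torch.where(mask.unsqueeze(-1), mask_token.expand_as(h), h)
    h = context_enc.encoder(h)

    # Regress the latent targets only at the masked positions.
    pred = predictor(h)
    return F.smooth_l1_loss(pred[mask], target[mask])


dim = 128
context_enc = ParticleEncoder(dim=dim)
target_enc = copy.deepcopy(context_enc).requires_grad_(False)
predictor = nn.Linear(dim, dim)
mask_token = nn.Parameter(torch.zeros(1, 1, dim))

x = torch.randn(2, 64, 7)  # toy jets: 64 particles x 7 features each
loss = masked_latent_loss(context_enc, target_enc, predictor, mask_token, x)
loss.backward()
```

Because the targets are continuous latent vectors rather than a fixed vocabulary of tokens, extra per-particle inputs (e.g. impact parameters or particle identification flags in addition to the four-vectors) only change `in_dim`; the pretrained context encoder can then be fine-tuned for downstream tasks such as jet tagging or anomaly detection.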
Authors
Dr Congqiao Li (Peking University (CN))
Huilin Qu (CERN)
Dr Qibin Liu (Tsung-Dao Lee Institute (CN) & Shanghai Jiao Tong University (CN))
Mr Shudong Wang (Chinese Academy of Sciences (CN))