6–11 Jul 2025
PALAIS DU PHARO, Marseille, France
Europe/Paris timezone

EveNet: Towards a Generalist Event Transformer for Unified Understanding and Generation of Collider Data

10 Jul 2025, 09:10
20m

Parallel T16 - AI for HEP (special topic 2025)

Speaker

Yulei Zhang (University of Washington)

Description

With the increasing size of machine learning (ML) models and the availability of vast datasets, foundation models have transformed how we apply ML to solve real-world problems. Multimodal language models such as ChatGPT and Llama extend their capabilities to specialized tasks from a common pre-training stage. Similarly, in high-energy physics (HEP), common analysis tasks face recurring challenges that demand scalable, data-driven solutions. In this talk, we present a foundation model for high-energy physics. Our model leverages extensive simulated datasets during pre-training to address tasks shared across analyses, offering a unified starting point for specialized applications. We demonstrate the benefits of using such a pre-trained model to improve search sensitivity, anomaly detection, event reconstruction, feature generation, and beyond. By harnessing the power of pre-trained models, we can push the boundaries of discovery with greater efficiency and insight.

Authors

Mr Bai-Hong Zhou (Tsung-Dao Lee Institute); Benjamin Nachman (LBNL); Dr Haoran Zhao (University of Washington); Qibin Liu (Tsung-Dao Lee Institute (CN) & Shanghai Jiao Tong University (CN)); Prof. Shih-Chieh Hsu (University of Washington); Prof. Shu Li (Tsung-Dao Lee Institute); Mr Ting-Hsiang Hsu (National Taiwan University); Vinicius Mikuni (Lawrence Berkeley National Laboratory); Mr Wang Wei-Po; Yuan-Tang Chou (University of Washington, Seattle (US)); Dr Yue Xu (University of Washington); Yulei Zhang (University of Washington)

Presentation materials

There are no materials yet.