Description
With the increasing size of machine learning (ML) models and the availability of vast datasets, foundation models have transformed how we apply ML to solve real-world problems. Multimodal language models such as ChatGPT and Llama have extended their capabilities to specialized tasks from a common pre-training stage. Similarly, in high-energy physics (HEP), common analysis tasks face recurring challenges that demand scalable, data-driven solutions. In this talk, we present a foundation model for HEP. Our model leverages extensive simulated datasets during pre-training to address tasks common across analyses, offering a unified starting point for specialized applications. We demonstrate the benefits of such a pre-trained model in improving search sensitivity, anomaly detection, event reconstruction, feature generation, and beyond. By harnessing the power of pre-trained models, we can push the boundaries of discovery with greater efficiency and insight.
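
As a rough illustration of the pre-train-then-fine-tune pattern the abstract describes (not the speakers' actual model or architecture), the PyTorch sketch below pre-trains a toy encoder on a surrogate reconstruction task over stand-in "simulated" events, then reuses the frozen encoder with a new task head. Every name, dimension, and task here is a hypothetical placeholder.

    # Hypothetical sketch of the generic pre-train / fine-tune workflow;
    # the encoder, dimensions, and tasks are illustrative placeholders.
    import torch
    import torch.nn as nn

    class EventEncoder(nn.Module):
        """Toy backbone mapping per-event features to an embedding."""
        def __init__(self, n_features=16, dim=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_features, dim), nn.ReLU(),
                nn.Linear(dim, dim), nn.ReLU(),
            )
        def forward(self, x):
            return self.net(x)

    # --- Pre-training on abundant simulated events (random stand-ins here) ---
    encoder = EventEncoder()
    pretrain_head = nn.Linear(64, 16)   # surrogate task: reconstruct the inputs
    opt = torch.optim.Adam(
        list(encoder.parameters()) + list(pretrain_head.parameters()), lr=1e-3)
    for _ in range(100):
        x = torch.randn(256, 16)        # stand-in for simulated events
        loss = nn.functional.mse_loss(pretrain_head(encoder(x)), x)
        opt.zero_grad(); loss.backward(); opt.step()

    # --- Fine-tuning: reuse the pre-trained encoder for a downstream task ---
    for p in encoder.parameters():      # freeze the shared backbone
        p.requires_grad = False
    task_head = nn.Linear(64, 2)        # e.g. signal vs. background
    opt = torch.optim.Adam(task_head.parameters(), lr=1e-3)
    for _ in range(50):
        x = torch.randn(256, 16)
        y = torch.randint(0, 2, (256,)) # stand-in labels
        loss = nn.functional.cross_entropy(task_head(encoder(x)), y)
        opt.zero_grad(); loss.backward(); opt.step()

Freezing the backbone keeps fine-tuning cheap and is one common choice; in practice the encoder can also be fine-tuned at a lower learning rate for each downstream analysis task.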