Description
Conveners:
- Thea Aarrestad (ETH Zurich)
- Dorothea vom Bruch (Aix Marseille Univ, CNRS/IN2P3, CPPM, Marseille)
Contact: eps-hep2025-conveners-T12-l@in2p3.fr
Dorothea vom Bruch (Aix Marseille Univ, CNRS/IN2P3, CPPM, Marseille, France) | 07/07/2025, 14:00 | T12 - Data Handling and Computing | Parallel
Since the beginning of Run 3 of the LHC, the upgraded LHCb experiment has been using a triggerless readout system, collecting data at an event rate of 30 MHz and a data rate of 4 TB/s. The trigger system is split into two high-level trigger (HLT) stages. In the first stage (HLT1), implemented on GPGPUs, track reconstruction and vertex fitting for charged particles are performed to reduce the event rate...
Philipp Nattland (RWTH Aachen University) | 07/07/2025, 14:20 | T12 - Data Handling and Computing | Parallel
During LHC Run 3, the CMS experiment faced challenging pileup and high event rate conditions. To efficiently select events of interest for physics analysis or alignment and calibrations, the CMS collaboration utilises a two-tiered triggering system. This system consists of a firmware-based Level-1 Trigger (L1) and a software-based High Level Trigger (HLT) that runs in a computing farm. The L1...
Biao Zhang (Heidelberg University (DE)) | 07/07/2025, 14:40 | T12 - Data Handling and Computing | Parallel
During the LHC Run 3 data-taking period, ALICE is reading out a factor of 600 more proton-proton collisions compared to Run 2, generating a data stream to the CERN T0 of over 30 GB/s, the highest among all LHC experiments. This dramatic increase was made possible through major upgrades to both the detector systems and the underlying data processing infrastructure.
The full data stream is...
Chihiro Yamada (Osaka University) | 07/07/2025, 15:00 | T12 - Data Handling and Computing | Parallel
The COMET experiment searches for the coherent neutrinoless conversion of a muon to an electron in an aluminum atomic nucleus, a charged Lepton Flavor Violation process forbidden in the Standard Model. The experiment proceeds in two phases, with Phase-I aiming for a single event sensitivity of $3 \times 10^{-15}$, improving the current limit by a factor of 100, using a high-intensity proton beam to...
Antonio De Maria (Nanjing University) | 07/07/2025, 15:20 | T12 - Data Handling and Computing | Parallel
The ATLAS experiment in LHC Run 3 uses a two-level trigger system to select events of interest, reducing the 40 MHz bunch crossing rate to a recorded rate of up to 3 kHz of fully-built physics events. The trigger system is composed of a hardware-based Level-1 trigger and a software-based High Level Trigger. The selection of events by the High Level Trigger is based on a wide variety of...
Tomoyuki Saito (The University of Tokyo) | 07/07/2025, 15:40 | T12 - Data Handling and Computing | Parallel
The Level-1 muon endcap trigger in the ATLAS experiment utilises signals from the Thin Gap Chambers (TGCs) located in the outer muon stations. A significant challenge for this system has been the high background rate caused by particles not originating at the interaction point, which increased the Level-1 trigger rate. To address this issue, the New Small Wheel (NSW) detectors, installed...
Miguel Ruiz Diaz | 08/07/2025, 16:30 | T12 - Data Handling and Computing | Parallel
The LHCb detector has undergone a major upgrade for Run 3 of the Large Hadron Collider (LHC) to take data at a nominal instantaneous luminosity increased by approximately a factor of five. A key component of this upgrade concerns the realization of a fully software-based trigger system that performs the reconstruction of tracks and particle candidates in real time, which can directly be used...
Giovanni Cavallero (INFN Ferrara) | 08/07/2025, 16:50 | T12 - Data Handling and Computing | Parallel
The LHCb experiment underwent a major upgrade in LHC Long Shutdown 2 and has been taking data in Run 3 at a five times higher instantaneous luminosity of 2 $\times$ 10$^{33}$ cm$^{-2}$ s$^{-1}$. The tracking detectors are all newly constructed and the particle identification detectors have been substantially upgraded with new frontend and backend electronics, allowing for the lowest level...
Jiri Masik (The University of Manchester (GB)) | 08/07/2025, 17:10 | T12 - Data Handling and Computing | Parallel
The ATLAS experiment at CERN is constructing an upgraded system for the "High Luminosity LHC", with collisions due to start in 2030. In order to deliver an order of magnitude more data than previous LHC runs, protons will collide at a centre-of-mass energy of 14 TeV with an instantaneous luminosity of up to 7.5 $\times$ 10$^{34}$ cm$^{-2}$ s$^{-1}$, resulting in much higher pileup and data rates than the current experiment was designed to...
Ralf Gugel (JGU Mainz) | 08/07/2025, 17:30 | T12 - Data Handling and Computing | Parallel
The ATLAS level-1 calorimeter trigger is a custom-built hardware system that identifies events containing calorimeter-based physics objects, including electrons, photons, taus, jets, and total and missing transverse energy.
In Run 3, L1Calo has been upgraded to process higher granularity input data. The new trigger comprises several FPGA-based feature extractor modules, which process the...
Iacopo Longarini (University of California Irvine (US)) | 08/07/2025, 17:50 | T12 - Data Handling and Computing | Parallel
The Monitored Drift Tube Trigger Processor (MDT-TP) will improve the rate capabilities of the first-level muon (L0 Muon) trigger of the ATLAS Experiment during the operation of the HL-LHC. The information of the trigger candidate, obtained by other muon trigger subsystems, will be combined with the precision of the MDT chambers in order to improve the resolution on the muon momentum...
Ruben Forti | 08/07/2025, 18:10 | T12 - Data Handling and Computing | Parallel
The precise reconstruction of charged particle tracks is crucial for the overall performance of the CMS experiment. In this contribution, performance measurements of the track reconstruction in both simulation and data will be presented, based on collisions recorded during the last periods of Run 3 data taking at the LHC. A particular focus will be given to the role and performance of...
Antonio Perez-Calero Yzquierdo (CIEMAT) | 10/07/2025, 08:30 | T12 - Data Handling and Computing | Parallel
With the approaching High Luminosity phase of the LHC programme, scheduled to start in 2030, the Offline Software and Computing group of the CMS collaboration is reviewing the experiment's computing model to ensure its readiness for the computing challenges the HL-LHC poses. An in-depth revision of the current model, tools and practices is being carried out, along with a programme of R&D...
Saswati Nandan | 10/07/2025, 08:50 | T12 - Data Handling and Computing | Parallel
Reducing event and data sizes is critical for experiments at the LHC, where high collision rates and increased detector granularity rapidly increase storage and processing requirements. In the CMS experiment, a recent development to address this challenge is the "Raw′" format: a new approach for recording silicon strip data in which only the reconstructed cluster's barycenter and average...
Giovanni Gaudino (SSM - INFN Napoli) | 10/07/2025, 09:10 | T12 - Data Handling and Computing | Parallel
The Belle II experiment at the SuperKEKB accelerator in Tsukuba, Japan, searches for physics beyond the Standard Model, with a focus on precise measurements of flavor physics observables. Highly accurate Monte Carlo simulations are essential for this endeavor, as they must correctly model the variations in detector conditions and beam backgrounds that occur during data collection. To meet this...
Kenneth Long (Massachusetts Institute of Technology (MIT)) | 10/07/2025, 09:30 | T12 - Data Handling and Computing | Parallel
The unprecedented volume of data and Monte Carlo simulations at the HL-LHC poses increasing challenges for particle physics analyses, demanding computation-efficient analysis workflows and reduced time to insight. The recent W mass measurement by CMS exemplifies these challenges and demonstrates the application of cutting-edge techniques essential for future analyses. We present a...
Anna Sinopoulou (INFN - Laboratori Nazionali del Sud) | 10/07/2025, 09:50 | T12 - Data Handling and Computing | Parallel
The KM3NeT collaboration is constructing two cutting-edge underwater neutrino detectors in the Mediterranean Sea: ARCA, which is optimized for the detection of astrophysical neutrinos, and ORCA, which aims to determine the neutrino mass hierarchy via the observation of atmospheric neutrinos. The increasing size of the detectors results in significant data volumes, requiring effective data...
Ulrich Schwickerath (CERN) | 10/07/2025, 10:10 | T12 - Data Handling and Computing | Parallel
Data preservation is essential for present and future experimental facilities, enabling cost-effective fundamental research by leveraging unique data sets as theoretical and experimental understanding advances. This contribution summarizes the status of data preservation in high energy physics from a perspective of 15 years of experience with a structured collaborative effort at international...