Since the beginning of Run 3 of the LHC, the upgraded LHCb experiment has been using a triggerless readout system, collecting data at an event rate of 30 MHz and a data rate of 4 TB/s. The trigger system is split into two high-level trigger (HLT) stages. During the first stage (HLT1), implemented on GPGPUs, track reconstruction and vertex fitting for charged particles are performed to reduce the event rate...
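As a rough consistency check (an editorial estimate, not a figure from the abstract, and assuming the quoted event and data rates refer to the same stream), these numbers imply an average event size of about $4\,\mathrm{TB/s} \,/\, 30\,\mathrm{MHz} \approx 130\,\mathrm{kB}$.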
During LHC Run 3, the CMS experiment faced challenging pileup and high event rate conditions. To efficiently select events of interest for physics analysis or alignment and calibrations, the CMS collaboration utilises a two-tiered triggering system. This system consists of a firmware-based Level-1 Trigger (L1) and a software-based High Level Trigger (HLT) that runs in a computing farm. The L1...
During the LHC Run 3 data taking period, ALICE is reading out a factor of 600 more proton–proton collisions compared to Run 2, generating a data stream to the CERN T0 of over 30 GB/s, the highest among all LHC experiments. This dramatic increase was made possible through major upgrades to both the detector systems and the underlying data processing infrastructure.
The full data stream is...
The COMET experiment aims to search for the coherent neutrinoless conversion of a muon to an electron in an aluminum atomic nucleus, a charged-lepton-flavor-violating process that has never been observed. The experiment is being conducted in two phases. The first phase, Phase-I, targets a single event sensitivity of $3 \times 10^{-15}$, an improvement by a factor of 100 over the...
The ATLAS experiment in the LHC Run 3 uses a two-level trigger system to select events of interest to reduce the 40 MHz bunch crossing rate to a recorded rate of up to 3 kHz of fully-built physics events. The trigger system is composed of a hardware-based Level-1 trigger and a software-based High Level Trigger. The selection of events by the High Level Trigger is based on a wide variety of...
The Level-1 muon endcap trigger in the ATLAS experiment utilises signals from
the Thin Gap Chambers (TGCs) located in the outer muon stations. A significant
challenge for this system has been the high background rate caused by particles
not originating at the interaction point, which increased the Level-1 trigger
rate. To address this issue, the New Small Wheel (NSW) detectors, installed...
The LHCb detector has undergone a major upgrade for Run 3 of the Large Hadron Collider (LHC) to take data at a nominal instantaneous luminosity increased by approximately a factor of five. A key component of this upgrade concerns the realization of a fully software-based trigger system that performs the reconstruction of tracks and particle candidates in real time, which can directly be used...
The LHCb experiment underwent a major upgrade in LHC Long Shutdown 2 and has been taking data in Run 3 at a five times higher instantaneous luminosity of $2 \times 10^{33}$ cm$^{-2}$ s$^{-1}$. The tracking detectors are all newly constructed and the particle identification detectors have been substantially upgraded with new frontend and backend electronics, allowing for the lowest level...
The ATLAS experiment at CERN is constructing upgraded systems for the High-Luminosity LHC, with collisions due to start in 2030. In order to deliver an order of magnitude more data than in previous LHC runs, protons will collide at a centre-of-mass energy of 14 TeV with an instantaneous luminosity of up to $7.5 \times 10^{34}$ cm$^{-2}$ s$^{-1}$, resulting in much higher pileup and data rates than the current experiment was designed to...
The ATLAS Level-1 calorimeter trigger (L1Calo) is a custom-built hardware system that identifies events containing calorimeter-based physics objects, including electrons, photons, taus, jets, and total and missing transverse energy.
In Run 3, L1Calo has been upgraded to process higher granularity input data. The new trigger comprises several FPGA-based feature extractor modules, which process the...
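To illustrate the kind of algorithm such feature extractors implement, the following Python sketch applies a sliding-window sum over a grid of calorimeter tower energies; the grid contents, window size, and threshold are hypothetical illustrations, not the L1Calo firmware logic.

```python
# Hypothetical sketch of sliding-window feature extraction over a grid of
# calorimeter tower transverse energies: every window whose summed energy
# passes a threshold is kept as a candidate object.

def window_sum(towers, i, j, size):
    # Sum the energies in a size x size window anchored at (i, j).
    return sum(towers[a][b]
               for a in range(i, i + size)
               for b in range(j, j + size))

def find_candidates(towers, threshold=10.0, size=2):
    n_eta, n_phi = len(towers), len(towers[0])
    candidates = []
    for i in range(n_eta - size + 1):
        for j in range(n_phi - size + 1):
            energy = window_sum(towers, i, j, size)
            if energy >= threshold:
                candidates.append((i, j, energy))
    return candidates

# Example: a 4x4 tower grid with one energetic deposit.
grid = [[0, 0, 0, 0],
        [0, 8, 5, 0],
        [0, 2, 1, 0],
        [0, 0, 0, 0]]
print(find_candidates(grid))  # [(0, 1, 13), (1, 0, 10), (1, 1, 16)]
```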
The Monitored Drift Tube Trigger Processor (MDT-TP) will improve the rate capabilities of the first-level muon (L0 Muon) trigger of the ATLAS Experiment during the operation of the HL-LHC.
Trigger-candidate information obtained by the other muon trigger subsystems will be combined with precision measurements from the MDT chambers in order to improve the resolution on the muon momentum...
With the approaching High Luminosity phase of the LHC programme, scheduled to start in 2030, the Offline Software and Computing group of the CMS collaboration is reviewing the experiment’s computing model to ensure its readiness for the computing challenges the HL-LHC poses. An in-depth revision of the current model, tools and practices is being carried out, along with a programme of R&D...
Reducing event and data sizes is critical for experiments at the LHC, where high collision rates and increased detector granularity rapidly increase storage and processing requirements. In the CMS experiment, a recent development to address this challenge is the “Raw′” format: a new approach for recording silicon strip data in which only each reconstructed cluster’s barycenter and average...
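As an illustration of the data reduction involved, the following Python sketch (hypothetical names and numbers, not CMS code) condenses a per-strip cluster into the two summary quantities the abstract mentions:

```python
# Illustrative sketch (not the CMS implementation): reduce a silicon-strip
# cluster to its charge-weighted barycenter and average charge, the kind of
# quantities a compact raw format could store instead of per-strip ADC counts.

def summarize_cluster(strips):
    """strips: list of (strip_index, adc_count) pairs for one cluster."""
    total_charge = sum(adc for _, adc in strips)
    # Charge-weighted mean strip position (the cluster barycenter).
    barycenter = sum(idx * adc for idx, adc in strips) / total_charge
    avg_charge = total_charge / len(strips)
    return barycenter, avg_charge

# Example: a hypothetical 3-strip cluster with ADC counts 20, 50, 30.
cluster = [(101, 20), (102, 50), (103, 30)]
print(summarize_cluster(cluster))  # (102.1, 33.33...)
```

Storing two floats per cluster rather than a variable-length list of strip ADC counts is what makes such a format substantially more compact.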
The Belle II experiment at the SuperKEKB accelerator in Tsukuba, Japan, searches for physics beyond the Standard Model, with a focus on precise measurements of flavor physics observables. Highly accurate Monte Carlo simulations are essential for this endeavor, as they must correctly model the variations in detector conditions and beam backgrounds that occur during data collection. To meet this...
The unprecedented volume of data and Monte Carlo simulations at the HL-LHC poses increasing challenges for particle physics analyses, demanding computationally efficient analysis workflows and reduced time to insight. The recent W mass measurement by CMS exemplifies these challenges and demonstrates the application of cutting-edge techniques essential for future analyses. We present a...
The KM3NeT collaboration is constructing two cutting-edge underwater neutrino detectors in the Mediterranean Sea: ARCA, which is optimized for the detection of astrophysical neutrinos, and ORCA, which aims to determine the neutrino mass hierarchy via the observation of atmospheric neutrinos. The increasing size of the detectors results in significant data volumes, requiring effective data...
Data preservation is essential for present and future experimental facilities, enabling cost-effective fundamental research by leveraging unique data sets as theoretical and experimental understanding advances. This contribution summarizes the status of data preservation in high energy physics from the perspective of 15 years of experience with a structured collaborative effort at international...