Machine learning has been an integral part of scientific research for several decades. However, the rapid expansion of generative artificial intelligence (AI) in all fields in recent years has placed AI at the center of strategic discussions at both the national and international levels. Among the six interdisciplinary challenges identified by the CNRS, one focuses specifically on the role of...
Over the past decade or so, the regional computing centres (mésocentres) have seen a new community of users emerge around artificial intelligence and Deep Learning, initially driven by computer science laboratories.
This evolution has been fuelled by the rise of GPUs, opening the way to new uses of regional HPC infrastructures.
Today, Machine Learning...
- The compute and data challenge
- The interconnection between computing centres
- The national and European strategy
Accurate energy calibration of calorimeters is essential for the physics goals of collider experiments, particularly at the CERN Large Hadron Collider. Conventional calibration strategies encounter growing limitations as calorimeter granularity increases. We propose a novel calibration method that simultaneously calibrates individual detector cells within a particle shower by targeting a...
Within the CTAO collaboration, GammaLearn is a project to develop deep learning solutions for the event reconstruction of Imaging Atmospheric Cherenkov Telescopes directly from the acquired images. Previous work demonstrated very good performance of the γ-PhysNet network architecture on simulated and real data under constrained conditions. However, image acquisition covers...
Searching for new physics often requires testing many different signal hypotheses across an extensive parameter space, such as signal mass or width. Traditional approaches typically involve training one classifier per hypothesis, which quickly becomes impractical when scanning over a broad range of parameters. At higher masses, where event yields are low, limited training data leads to...
Inverse problems in astronomy are often computationally expensive, and Markov Chain Monte Carlo (MCMC) routines become impractical for massive, heterogeneous datasets or when only limited data are available. With the advent of the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST), the astronomical community will receive millions of transient alerts daily, many of which evolve on...
Maximizing the scientific discovery rate in complex modern experiments demands advanced data analysis and real-time control. We present a suite of recently developed Artificial Intelligence (AI) and Machine Learning (ML) applications that can transform the precision and efficiency of experimental work at GANIL.
Our efforts are focused on two critical areas:
- Deep Neural Networks...
We present a hybrid machine-learning framework that combines high-accuracy numerical regression with symbolic regression to model and interpret nuclear charge radii. Using Light Gradient Boosting and Gaussian Process Regression with rigorous cross-validation, the method reproduces experimental trends across the nuclear chart and distills them into simple analytical expressions. These formulas...
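As an illustration of the numerical-regression side of such a pipeline, the sketch below fits cross-validated LightGBM and Gaussian Process regressors to a table of charge radii. It is a minimal sketch only: the input file, the feature choice (Z, N and simple derived quantities) and the hyperparameters are assumptions for illustration, not the authors' actual configuration.

```python
# Minimal sketch (not the authors' code): cross-validated LightGBM and
# Gaussian Process regression for nuclear charge radii.
# The data file, features and hyperparameters are hypothetical.
import pandas as pd
from lightgbm import LGBMRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import KFold, cross_val_score

# Hypothetical table with proton number Z, neutron number N, measured radius (fm)
df = pd.read_csv("charge_radii.csv")  # columns: Z, N, R_exp
X = (df[["Z", "N"]]
     .assign(A=lambda d: d.Z + d.N,                     # mass number
             I=lambda d: (d.N - d.Z) / (d.Z + d.N))     # isospin asymmetry
     .to_numpy())
y = df["R_exp"].to_numpy()

cv = KFold(n_splits=5, shuffle=True, random_state=0)
models = {
    "LightGBM": LGBMRegressor(n_estimators=500, learning_rate=0.05),
    "GPR": GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True),
}

for name, model in models.items():
    rmse = -cross_val_score(model, X, y, cv=cv,
                            scoring="neg_root_mean_squared_error")
    print(f"{name}: CV RMSE = {rmse.mean():.4f} +/- {rmse.std():.4f} fm")
```

The cross-validated scores of the two regressors can then serve as the numerical baseline against which simpler symbolic-regression formulas are compared.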
The SVOM satellite mission, launched in June 2024, is dedicated to the study of Gamma-Ray Bursts (GRBs). The ECLAIRs trigger onboard SVOM, which reorients the satellite for GRB follow-up observations, also provides a real-time Alert Sequence for each detected GRB, transmitted to the ground over the SVOM VHF receiver network. One of the two trigger algorithms, the Image Trigger (IMT), transmits at the end...
The rapid advancements in deep learning, particularly in computer vision, present significant opportunities for industrial innovation. However, many companies face substantial barriers to entry, including the high cost of computational resources and a shortage of specialized machine learning talent. The IDEFICS project aims to address these challenges by providing a comprehensive framework...
Artificial intelligence (AI) is emerging today as a major driver of innovation for research infrastructures. In the field of particle accelerators, it opens up unprecedented opportunities for real-time control, predictive diagnostics, beam optimization, and the design of new devices. This presentation will provide an overview of current AI approaches and applications in accelerator science,...
The next generation of gravitational wave observatories—the Einstein Telescope (ET), Cosmic Explorer (CE), and LISA—will revolutionize astrophysics but present unprecedented data analysis challenges. LISA will detect tens of thousands of overlapping signals requiring simultaneous inference in high-dimensional Bayesian settings, while ground-based detectors will face thousands of overlapping...
The High-Luminosity LHC (HL-LHC) will provide unprecedented opportunities for precision measurements and new physics searches, but it will also bring extreme challenges for event reconstruction in the dense pile-up environment. To meet these challenges, the CMS detector is undergoing major upgrades, including the replacement of its endcap calorimeters with the High-Granularity Calorimeter...
We showcase recent advancements from the magnet laboratory at CEA/IRFU in designing superconducting magnets.
This competition in high-energy physics (HEP) and machine learning was the first to strongly emphasise uncertainties in the $H \rightarrow \tau^+ \tau^-$ cross-section measurement. Participants were tasked with developing advanced analysis techniques capable of dealing with uncertainties in the input training data and providing credible confidence intervals. The accuracy of these intervals was...
The development of innovative methods for fission trigger construction addresses the challenge of recognising fission signatures in very complex detector response functions.
The fission recognition approaches available today have intrinsic limitations.
More specifically, the existing dedicated detectors for fission triggering impose constraints on the experimental setup geometry...
Through the R&T THINK project, we are trying to build models that can replace classical signal processing as close as possible to the detectors. I will present the challenges of running such models on embedded hardware, then the signal-processing problem of the R2D2 experiment for research into the nature of the neutrino. I will then present a deconvolution model of the electronics chain to recover the...
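As a rough illustration of what such a deconvolution model could look like, the sketch below defines a small 1D convolutional network mapping a digitised waveform to an estimate of the underlying signal. The architecture, layer sizes and training data are illustrative assumptions, not the actual THINK/R2D2 model.

```python
# Minimal sketch, not the THINK/R2D2 model: a small 1D convolutional network
# trained to deconvolve the electronics-chain response from digitised waveforms.
# Layer sizes and the fake training data are illustrative assumptions.
import torch
import torch.nn as nn

class Deconvolver(nn.Module):
    def __init__(self, channels=16, kernel=9):
        super().__init__()
        pad = kernel // 2
        self.net = nn.Sequential(
            nn.Conv1d(1, channels, kernel, padding=pad), nn.ReLU(),
            nn.Conv1d(channels, channels, kernel, padding=pad), nn.ReLU(),
            nn.Conv1d(channels, 1, kernel, padding=pad),  # same-length output
        )

    def forward(self, x):           # x: (batch, 1, n_samples) raw waveform
        return self.net(x)          # estimate of the underlying signal

model = Deconvolver()
waveform = torch.randn(8, 1, 1024)   # fake digitised traces
target = torch.randn(8, 1, 1024)     # fake "true" signals, for the sketch only
loss = nn.functional.mse_loss(model(waveform), target)
loss.backward()
```

A small, purely convolutional model of this kind keeps the parameter count low, which matters when inference has to run on embedded electronics close to the detector.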
As part of its upgrade for the High-Luminosity LHC (HL-LHC), the CMS experiment is deploying a novel High Granularity Calorimeter (HGCAL) in the endcap regions. Designed with fine segmentation in both longitudinal and transverse directions, HGCAL will be the first calorimeter specifically optimised for particle-flow reconstruction to operate at a colliding-beam experiment. The calorimeter...
The Large Hadron Collider (LHC) collides protons at nearly the speed of light, producing new particles observed by the ATLAS detector. In 2026, the LHC will undergo a major upgrade to the High-Luminosity LHC (HL-LHC), increasing luminosity by a factor of 5–7 and delivering up to 200 simultaneous collisions. To cope with the resulting data rates, ATLAS will replace the readout electronics of...
Collider rings require many sensors distributed around the ring in order to operate. One of these sensors is the Beam Position Monitor (BPM), which allows operators to measure whether the beam travelling in the ring is well centered between the different magnets. One specific category of BPMs stands out because of its high acquisition rate: the Turn-by-turn BPMs (TbTBPMs). Several methods...
From Prototyping to Production: Integrating and Scaling GNN Tracking for the HL-LHC within the ATLAS Software Framework
The High-Luminosity LHC (HL-LHC) upgrade of the ATLAS Inner Tracker (ITk) presents unprecedented challenges for track reconstruction, driven by the large number of silicon cluster readouts and the high throughput required under tight computing constraints. Graph Neural...
Cosmological research in the era of deep, wide-area surveys such as Euclid and Rubin/LSST benefits greatly from combining datasets collected with different instruments. However, the large volume of data makes analysis increasingly challenging. To address this, we developed a package based on the Variational Autoencoder (VAE) architecture that enables compact representations of spectroscopic...
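For readers unfamiliar with the underlying architecture, the sketch below shows a minimal VAE that compresses a one-dimensional spectrum into a low-dimensional latent vector and reconstructs it. It is only a generic illustration of the VAE idea, assuming a dense encoder/decoder and a Gaussian latent space; it is not the released package.

```python
# Generic VAE sketch for 1D spectra (illustrative assumptions only).
import torch
import torch.nn as nn

class SpectrumVAE(nn.Module):
    def __init__(self, n_bins=4000, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_bins, 512), nn.ReLU(),
                                     nn.Linear(512, 2 * latent_dim))  # mu, logvar
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 512), nn.ReLU(),
                                     nn.Linear(512, n_bins))

    def forward(self, x):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterisation
        return self.decoder(z), mu, logvar

def vae_loss(x, recon, mu, logvar):
    # Reconstruction term plus KL divergence to the standard normal prior
    recon_term = nn.functional.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_term + kl

x = torch.randn(32, 4000)                 # fake batch of spectra
recon, mu, logvar = SpectrumVAE()(x)
loss = vae_loss(x, recon, mu, logvar)
```

The latent vectors (mu) then provide the compact representation used in place of the full spectra for downstream analysis.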
Calorimeter shower shape observables have been mismodelled since the start of ATLAS data taking, owing to imperfections in the Geant4 detector simulation. Shower shape variables are discriminating observables used in the identification of electrons and photons, and accurate modelling of their distributions is essential for precision measurements and searches in high-energy...
Turn-by-turn beam position monitor (BPM) data are vital for fast optics diagnostics in modern colliders, but they are often degraded by noise, spikes, and signal dropouts. We present ongoing work on a dual-decoder convolutional autoencoder that addresses these issues in an unsupervised setting. A shared encoder compresses BPM waveforms into a latent representation. Two decoders then serve...
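The sketch below illustrates the general shape of such an architecture: a shared 1D convolutional encoder feeding two decoder heads. It is a minimal sketch under stated assumptions; the layer sizes are arbitrary, and what each decoder is actually trained to produce is not specified in this excerpt, so both heads are left as generic reconstruction branches.

```python
# Minimal sketch of a dual-decoder convolutional autoencoder for BPM waveforms.
# Layer sizes and the roles of the two decoders are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(nn.Conv1d(c_in, c_out, 5, stride=2, padding=2), nn.ReLU())

def deconv_block(c_in, c_out):
    return nn.Sequential(
        nn.ConvTranspose1d(c_in, c_out, 5, stride=2, padding=2, output_padding=1),
        nn.ReLU())

class DualDecoderAE(nn.Module):
    def __init__(self):
        super().__init__()
        # shared encoder: waveform -> latent representation
        self.encoder = nn.Sequential(conv_block(1, 16), conv_block(16, 32))
        # two decoder heads reading the same latent representation
        self.decoder_a = nn.Sequential(
            deconv_block(32, 16),
            nn.ConvTranspose1d(16, 1, 5, stride=2, padding=2, output_padding=1))
        self.decoder_b = nn.Sequential(
            deconv_block(32, 16),
            nn.ConvTranspose1d(16, 1, 5, stride=2, padding=2, output_padding=1))

    def forward(self, x):            # x: (batch, 1, n_turns)
        z = self.encoder(x)          # shared latent representation
        return self.decoder_a(z), self.decoder_b(z)

x = torch.randn(4, 1, 1024)          # fake TbT BPM waveforms
out_a, out_b = DualDecoderAE()(x)    # both outputs match the input length
```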
The interTwin project develops an open-source Digital Twin Engine to integrate application-specific Digital Twins (DTs) across scientific domains. Its framework for the development of DTs supports interoperability, performance, portability and accuracy. As part of this initiative, we implemented the CaloINN normalizing-flow model for calorimeter simulations within the interTwin framework....
The C70XP cyclotron of the public interest group (GIP) ARRONAX, used for radioisotope production for medical and research applications, relies on complex and costly systems that are prone to failures, leading to operational disruptions. In this context, research is being conducted to develop an active machine learning method for early anomaly detection to enhance system performance. One of the most...
Advancements in geometric deep learning offer powerful tools to study the internal structure of jets initiated by heavy quarks, particularly in the context of the dead-cone effect and jet quenching. The kinematics of b-hadron decays present a challenge for substructure measurements with inclusive b-jets, which are essential for quantum chromodynamics (QCD) studies. We propose an approach using...
Exploring exotic Higgs boson decays often requires access to challenging regions of phase space where standard reconstruction techniques become limited, making potential signals effectively invisible. In particular, when decay products become highly collimated and overlap in the calorimeter, conventional algorithms lose sensitivity. We present a novel machine learning based technique for...
The Cherenkov Telescope Array Observatory (CTAO) is an international observatory currently under construction, which will consist of two sites (one in the Northern Hemisphere and one in the Southern Hemisphere). It will eventually be the largest and most sensitive ground-based gamma-ray observatory. In the meantime, a small subarray composed of four Large-Sized Telescopes (LSTs) at the...
In high-energy collisions, jets, which are collimated sprays of particles, can originate from various fundamental particles, including W and Z bosons, top quarks, and the Higgs boson. Accurately identifying these jets is crucial for studying Standard Model processes and investigating new physics beyond its framework. This study, conducted within the ATLAS collaboration at the Large Hadron...
KM3NeT is a new research infrastructure housing the next generation of neutrino telescopes in the Mediterranean deep sea. This facility comprises two detectors, KM3NeT/ARCA and KM3NeT/ORCA, consisting of 230 and 115 vertically arranged detection units, respectively, each equipped with 18 digital optical modules. The photomultipliers within each optical module detect Cherenkov light emitted by...
The Large Hadron Collider (LHC) is designed to probe the limits of the Standard Model and search for new phenomena. Machine Learning (ML) has become a powerful tool in this endeavor, particularly for jet tagging tasks. Large-scale datasets such as JetClass, which contains over 100 million simulated jet events, enable not only the training of supervised models but also the development of...