The physics reach of the HL-LHC will be limited by how efficiently the experiments can use the available computing resources, i.e. affordable software and computing are essential. The development of novel methods for charged particle reconstruction at the HL-LHC incorporating machine learning techniques or based entirely on machine learning is a vibrant area of research. In the past two years,...
Because of the nature of QCD interactions with matter, the measured energies and masses of hadronic jets have to be calibrated before they are used in physics analyses. The correction depends on many characteristics of the jets, including the energy and mass themselves. Obtaining the correction is thus a multidimensional regression problem for which DNNs are a well-suited approach.
In...
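As a concrete illustration of this regression setup, here is a minimal Keras sketch; the input features, network size, and targets are assumptions for illustration, not the actual calibration inputs.

```python
# Minimal sketch of a multidimensional regression DNN for jet calibration.
# Feature list and architecture are illustrative assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

n_features = 6  # e.g. reco energy, mass, eta, area, width, n_constituents (assumed)

model = tf.keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(128, activation="relu"),
    layers.Dense(128, activation="relu"),
    layers.Dense(2),  # predicted energy and mass correction factors
])
model.compile(optimizer="adam", loss="mse")

# x: jet-level features; y: target corrections derived from simulation
x = np.random.rand(1000, n_features).astype("float32")
y = np.random.rand(1000, 2).astype("float32")
model.fit(x, y, epochs=5, batch_size=64, verbose=0)
```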
Reconstructing the di-$\tau$ mass faster and more accurately than existing methods is crucial for testing any theory involving Higgs or Z bosons decaying to $\tau^+ \tau^-$. However, it is an arduous task because each $\tau$ lepton decay produces neutrinos, which are invisible to the detectors at the LHC.
This ongoing work aims at obtaining a di-$\tau$...
Machine Learning methods are extremely powerful but often function as black-box problem solvers, providing improved performance at the expense of clarity. Our work describes a new machine learning approach which translates the strategy of a deep neural network into simple functions that are meaningful and intelligible to the physicist, without sacrificing performance improvements. We apply...
Among all of the applications of Machine Learning in HEP, anomaly detection methods have received growing interest in recent years. Their use is especially promising for the development of model-independent search techniques. Following this trend, we propose new algorithms based on the artificial neural network concept of the Auto-Encoder, augmented with...
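A minimal sketch of the Auto-Encoder idea, assuming events are summarized as fixed-length feature vectors (all sizes are illustrative): train on background only and flag events with large reconstruction error as anomalies.

```python
# Autoencoder-based anomaly detection on fixed-length event feature vectors.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

n_features = 20
autoencoder = tf.keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(4, activation="relu"),   # bottleneck: compressed latent space
    layers.Dense(64, activation="relu"),
    layers.Dense(n_features),
])
autoencoder.compile(optimizer="adam", loss="mse")

# Train to reconstruct background events only.
background = np.random.rand(5000, n_features).astype("float32")
autoencoder.fit(background, background, epochs=10, batch_size=128, verbose=0)

# Events the model reconstructs poorly are flagged as anomalous.
test = np.random.rand(100, n_features).astype("float32")
score = np.mean((autoencoder.predict(test, verbose=0) - test) ** 2, axis=1)
anomalous = score > np.quantile(score, 0.99)
```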
Within the Phase-II upgrade of the LHC, the readout electronics of the ATLAS Liquid Argon (LAr) Calorimeters is being prepared for high-luminosity operation, with an expected pile-up of up to 200 simultaneous pp interactions. The LAr calorimeters measure the energy of particles produced in LHC collisions, especially electrons and photons. The digitized signals from the 182,468 LAr channels are...
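For illustration, a small convolutional network regressing the deposited energy from a window of digitized pulse samples could look as follows; the window length and layer sizes are assumptions, not the actual ATLAS design.

```python
# Sketch of a compact 1D CNN mapping a window of digitized ADC samples to a
# reconstructed pulse energy. Sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

n_samples = 30  # digitized samples per window (assumed)
model = tf.keras.Sequential([
    layers.Input(shape=(n_samples, 1)),
    layers.Conv1D(8, kernel_size=5, activation="relu"),
    layers.Conv1D(8, kernel_size=5, activation="relu"),
    layers.Flatten(),
    layers.Dense(1),  # reconstructed energy of the pulse
])
model.compile(optimizer="adam", loss="mse")
```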
Next-generation experiments such as the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) will provide an unprecedented volume of time-domain data, opening a new era of big data in astronomy. To fully harness the power of these surveys, we require analysis methods that can cope with large data volumes and identify promising transients within minutes for follow-up...
This contribution reviews the GPUs available at CC-IN2P3 and their current usage. Some upcoming evolutions will also be presented.
The 2021 edition of the School of Statistics (SOS2021) was held online for the first time (postponed from May 2020 in Carry-le-Rouet), from 18 to 29 January 2021. The school targets PhD students, postdocs, and senior scientists wishing to strengthen their knowledge of, or discover new methods in, statistical analysis applied to particle and astroparticle physics and cosmology.
The programme covers...
I will present a first investigation of the suitability and performance of IPUs in deep learning applications in cosmology.
As upcoming photometric galaxy surveys will produce an unprecedented amount of observational data, more and more people turn to deep learning for fast and accurate data processing. In this work I tested typical examples of tasks that will be required to process and...
The KM3NeT neutrino telescopes search for cosmic neutrinos from distant astrophysical sources such as supernovae, gamma-ray bursters, colliding stars, or flaring blazars. Once the events are received, they are rapidly reconstructed online. These online events must be classified to separate signal neutrinos from atmospheric-muon background events. Dedicated applications will then analyse...
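As an illustration of the classification step, a minimal binary classifier could look like the sketch below; the event features are hypothetical stand-ins for the actual reconstruction-level quantities used online.

```python
# Minimal binary classifier: neutrino signal vs atmospheric-muon background.
# Feature list is an illustrative assumption.
import tensorflow as tf
from tensorflow.keras import layers

n_features = 10  # e.g. reconstructed zenith, fit quality, hit counts (assumed)
clf = tf.keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # P(event is a neutrino)
])
clf.compile(optimizer="adam", loss="binary_crossentropy")
```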
The localization of radioactive sources provides mandatory information for the monitoring and diagnosis of radiological scenes, and it still constitutes a critical challenge. Gamma-ray imaging is performed through coded-mask aperture imaging when the energy of the photons is sufficiently low to ensure photoelectric interactions in the mask. Then, classically, a deconvolution algorithm is...
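For context, the classical decoding step can be sketched as a cross-correlation of the recorded detector image with a decoding array derived from the mask pattern; the mask, sizes, and count rates below are illustrative.

```python
# Classical coded-mask decoding: cross-correlate the detector image with a
# balanced decoding array built from the mask pattern. All values illustrative.
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=(11, 11))           # open/closed mask elements (assumed)
decoder = 2 * mask - 1                             # balanced decoding array

detector_image = rng.poisson(5.0, size=(64, 64))   # counts recorded behind the mask
sky_estimate = correlate2d(detector_image, decoder, mode="same")
```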
Currently, dynamic aperture calculations for high-energy hadron colliders are generated through computer simulation, which is both a resource-heavy and time-costly process. The aim of this research is to use a reservoir-computing machine learning model to achieve a faster extrapolation of dynamic aperture values. To this end, a recurrent echo-state network...
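A minimal numpy sketch of such an echo-state network is given below: a fixed random reservoir is driven by the input sequence, and only the linear readout is trained (here by ridge regression). All sizes and hyperparameters are illustrative.

```python
# Echo-state network: fixed random recurrent reservoir + trained linear readout.
import numpy as np

rng = np.random.default_rng(42)
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo-state property)

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Train the readout to predict the next value of a toy sequence.
u = np.sin(np.linspace(0, 20 * np.pi, 2000))[:, None]
X, y = run_reservoir(u[:-1]), u[1:, 0]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
prediction = run_reservoir(u) @ W_out
```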
The Cherenkov Telescope Array (CTA) is the future of ground-based gamma-ray astronomy and will be composed of tens of telescopes split between two arrays, one in each hemisphere.
GammaLearn is a project started in 2017 to develop innovative deep-learning-based analyses for CTA event reconstruction.
Here we present a status report of the project, the network architecture developed for event...
We present a novel methodology to address ill-posed inverse problems by providing a description of the posterior distribution instead of a point-estimate solution. Our approach combines Neural Score Matching, for learning a prior distribution from physical simulations, with an annealed Hamiltonian Monte Carlo technique to sample the full high-dimensional posterior of our problem.
In the...
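As a simplified illustration of the sampling scheme, the sketch below uses annealed Langevin dynamics as a stand-in for the annealed Hamiltonian Monte Carlo of the abstract; `prior_score` and `likelihood_score` are hypothetical toy placeholders (in the actual method, the prior score would come from the Neural Score Matching model).

```python
# Annealed Langevin sampling with a (toy) learned score: anneal from high to
# low noise levels, taking noisy gradient steps on the log-posterior.
import numpy as np

def prior_score(x, sigma):
    """Placeholder: score of a standard Gaussian prior convolved with
    N(0, sigma^2), i.e. of N(0, 1 + sigma^2). A toy stand-in for the
    Neural Score Matching model."""
    return -x / (1.0 + sigma**2)

def likelihood_score(x, data, sigma):
    """Placeholder: annealed gradient of a unit-variance Gaussian
    log-likelihood (toy assumption)."""
    return (data - x) / (1.0 + sigma**2)

def annealed_langevin(data, sigmas, n_steps=100, eps=1e-4, rng=None):
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.normal(size=data.shape)
    for sigma in sigmas:                        # anneal from high to low noise
        step = eps * (sigma / sigmas[-1]) ** 2  # larger steps at higher noise
        for _ in range(n_steps):
            grad = prior_score(x, sigma) + likelihood_score(x, data, sigma)
            x = x + 0.5 * step * grad + np.sqrt(step) * rng.normal(size=x.shape)
    return x  # approximate sample from the posterior

sample = annealed_langevin(np.ones(16), sigmas=np.geomspace(10.0, 0.01, 10))
```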
Weak gravitational lensing is one of the most promising tools of cosmology to constrain models and probe the evolution of dark-matter structures. Yet, the current analysis techniques are only able to exploit the 2-pt statistics of the lensing signal, ignoring a large fraction of the cosmological information contained in the non-Gaussian part of the signal. Exactly how much information is lost,...
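For context, the 2-pt statistics referred to here are typically the shear two-point correlation functions, whose standard definition (not specific to this abstract) is:

```latex
% Shear two-point correlation functions, measured as a function of the
% angular separation \theta between galaxy pairs:
\xi_\pm(\theta) = \langle \gamma_t \gamma_t \rangle(\theta)
              \pm \langle \gamma_\times \gamma_\times \rangle(\theta)
```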
Telescope images are corrupted by blur and noise. Generally, the blur is represented by a convolution with a Point Spread Function, and the noise is modelled as additive Gaussian noise. Restoring galaxy images from the observations is an inverse problem that is ill-posed and, specifically, ill-conditioned. The majority of standard reconstruction methods minimise the Mean Square Error to...
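For reference, the forward model described above and a classical restoration baseline can be sketched as follows; the toy image, PSF width, and noise level are illustrative, and the Wiener filter stands in as a generic example of an MSE-style linear reconstruction, not the method of the abstract.

```python
# Forward model y = PSF * x + Gaussian noise, and Wiener-filter restoration
# in Fourier space. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 64
x = np.zeros((n, n)); x[28:36, 28:36] = 1.0  # toy "galaxy" image

# Gaussian PSF, convolution applied via FFT (periodic boundaries assumed)
yy, xx = np.mgrid[:n, :n]
psf = np.exp(-((yy - n / 2) ** 2 + (xx - n / 2) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
H = np.fft.fft2(np.fft.ifftshift(psf))

sigma = 0.01
y = np.real(np.fft.ifft2(H * np.fft.fft2(x))) + sigma * rng.normal(size=(n, n))

# Wiener deconvolution: regularized inverse filter
nsr = 1e-3  # assumed noise-to-signal power ratio
X_hat = np.conj(H) * np.fft.fft2(y) / (np.abs(H) ** 2 + nsr)
x_hat = np.real(np.fft.ifft2(X_hat))
```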