22–23 Jan 2020
CC-IN2P3
Europe/Paris timezone

Application of Bayesian Convolutional Neural Network to spectral identification of radionuclides for nuclear monitoring

23 Jan 2020, 09:00
25m
Amphi (CC-IN2P3)

21 avenue Pierre de Coubertin CS70202 69627 Villeurbanne cedex
ML algorithms: Machine Learning development across applications

Speaker

Geoffrey Daniel (CEA/Irfu/DAp)

Description

We apply artificial neural networks to the spectral identification of radionuclides for nuclear monitoring of unknown scenes, a task that requires fast and fully automatic remote-sensing analysis.
We propose a new Bayesian Convolutional Neural Network (BCNN) architecture capable of fast identification of radionuclides whose signal is recorded in a spectrum by a compact CdTe-based imaging spectrometer, namely Caliste. Our BCNN can identify the presence of a source within a mixture of sources with high precision, even in a low-counting-statistics regime. In addition, we developed a process to estimate the uncertainty of the algorithm's output. The training database was generated using only synthetic datasets, simulated with Geant4 and convolved with a simplified model of the detector response, allowing the detection of sources that the detector has never measured before.
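The abstract does not specify how the Bayesian treatment or the uncertainty estimate is implemented. The sketch below illustrates one common way such a spectrum classifier could be set up: a 1-D convolutional network over the energy histogram with Monte Carlo dropout as an approximate Bayesian mechanism. The layer sizes, the number of spectrum bins and radionuclides, and all names are assumptions for illustration, not the architecture presented in the talk.

```python
# Illustrative sketch only: 1-D CNN over an energy spectrum with Monte Carlo dropout
# as an approximate Bayesian treatment. Bin count, nuclide count, layer sizes and the
# use of MC dropout are assumptions, not the authors' architecture.
import torch
import torch.nn as nn

N_CHANNELS = 1024   # assumed number of spectrum bins
N_NUCLIDES = 8      # assumed number of radionuclides to identify

class SpectrumBCNN(nn.Module):
    def __init__(self, n_bins=N_CHANNELS, n_out=N_NUCLIDES, p_drop=0.2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Dropout(p_drop),          # kept stochastic at test time for MC sampling
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_bins // 16), 64), nn.ReLU(),
            nn.Dropout(p_drop),
            nn.Linear(64, n_out),        # one logit per radionuclide (multi-label)
        )

    def forward(self, x):
        return self.head(self.features(x))

def predict_with_uncertainty(model, spectrum, n_samples=50):
    """Run several stochastic forward passes; return the mean presence probability
    per radionuclide and its standard deviation as an uncertainty proxy."""
    model.train()                        # keep dropout active during sampling
    with torch.no_grad():
        samples = torch.stack(
            [torch.sigmoid(model(spectrum)) for _ in range(n_samples)]
        )
    return samples.mean(dim=0), samples.std(dim=0)

if __name__ == "__main__":
    model = SpectrumBCNN()
    fake_spectrum = torch.rand(1, 1, N_CHANNELS)   # placeholder counts histogram
    mean_p, std_p = predict_with_uncertainty(model, fake_spectrum)
    print(mean_p.shape, std_p.shape)               # -> (1, 8) each
```

Averaging the stochastic passes gives a presence score per radionuclide, and the spread across passes is one possible proxy for the output uncertainty mentioned in the abstract.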
We evaluated the performance of the synthetically trained network on real data recorded in the lab with calibration sources. We recorded the event list with Caliste, developed by the astrophysics department of CEA. Caliste is a 256-pixel detector with a 625 µm pitch. Each channel is an independent spectrometer, and the data to be analysed are the calibrated sum of all single events recorded across the channels. Caliste has an energy resolution of ~1% FWHM at 60 keV and 662 keV and operates in the range from 2 keV up to 1 MeV. We report the precision and recall of our algorithm and demonstrate its capability for radionuclide identification. We applied the process to real-time measurements, running our neural networks continuously on the most recently acquired photons, illustrating its performance with moving sources or a moving detector, which is relevant to robotic exploration of post-accident scenes.
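Running the network continuously on the most recently acquired photons suggests a rolling window over the calibrated event list. The sketch below shows one plausible realisation; the window size, binning, and the identify() placeholder are assumptions for illustration, not the authors' pipeline.

```python
# Illustrative sketch only: rolling-window analysis of an incoming photon event list.
# Window size, binning and identify() are assumptions, not the authors' pipeline.
from collections import deque
import numpy as np

N_BINS = 1024
E_MIN_KEV, E_MAX_KEV = 2.0, 1000.0      # Caliste operating range quoted in the abstract
WINDOW = 5000                           # assumed number of most recent photons kept

edges = np.linspace(E_MIN_KEV, E_MAX_KEV, N_BINS + 1)
recent_events = deque(maxlen=WINDOW)    # calibrated photon energies (keV)

def identify(spectrum: np.ndarray) -> np.ndarray:
    """Placeholder for the trained network; returns one score per radionuclide."""
    return np.zeros(8)

def on_new_photon(energy_kev: float) -> np.ndarray:
    """Update the rolling event list, rebuild the summed spectrum and re-run identification."""
    recent_events.append(energy_kev)
    spectrum, _ = np.histogram(recent_events, bins=edges)
    return identify(spectrum.astype(np.float32))

# Example: feed a stream of simulated photon energies.
rng = np.random.default_rng(0)
for e in rng.uniform(E_MIN_KEV, E_MAX_KEV, size=20000):
    scores = on_new_photon(e)
```

In practice the spectrum would more likely be rebuilt and re-classified at a fixed cadence rather than after every photon; the per-photon loop here only keeps the example short.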

Author

Geoffrey Daniel (CEA/Irfu/DAp)

Co-authors

Presentation materials