Description
Research on nuclear reactors, whether for the optimization of the current generation or for the study of future generations, requires simulation. Reactor operation parameters, fuel burn-up, waste production, etc. can be studied with Monte Carlo or deterministic codes. These codes simulate the fundamental interactions of nucleons or ions with matter and take as inputs nuclear data such as reaction cross sections, angular distributions, fission yields, decay data, etc. These inputs, called evaluated nuclear data, are compiled in evaluated nuclear data libraries and are determined from experimental data and state-of-the-art nuclear reaction codes. Today's computing power allows precise sensitivity studies, which reveal that one of the major factors limiting the accuracy of simulated reactor parameters is the accuracy of the evaluated nuclear data used as inputs.

The international community is therefore continuously working on improving evaluated nuclear data libraries such as the European JEFF (Joint Evaluated Fission and Fusion File), the US ENDF (Evaluated Nuclear Data File), and the Japanese JENDL (Japanese Evaluated Nuclear Data Library), among others. The quality of these libraries can be improved through efforts on both the experimental and the theoretical sides, since nuclear data evaluation today relies heavily on nuclear models. In cases where experimental data are scarce or known only with low precision, new measurements are mandatory to provide relevant new constraints for nuclear modeling. Moreover, integral experimental data are also used for validation in the evaluation cycle.

In this presentation, after describing the context and the issues of nuclear data for reactor physics, I will focus on the new challenges we face for microscopic experimental data (used to improve theoretical modeling and for evaluation) in the framework of the development of modern, high-performance evaluated data libraries.
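To illustrate how cross sections enter such simulations, here is a deliberately simplified sketch (not one of the production codes mentioned above): a one-speed, one-dimensional Monte Carlo transport toy in Python. The absorption and scattering cross-section values are invented placeholders standing in for quantities that a real code would build from an evaluated library (e.g. JEFF or ENDF) and material densities.

```python
import math
import random

# Illustrative macroscopic cross sections (1/cm) for a fictitious homogeneous
# medium; in a real code these would be derived from evaluated microscopic
# cross sections and atomic number densities.
SIGMA_ABSORPTION = 0.02
SIGMA_SCATTERING = 0.18
SIGMA_TOTAL = SIGMA_ABSORPTION + SIGMA_SCATTERING

SLAB_THICKNESS_CM = 50.0


def transmitted_fraction(n_histories: int, seed: int = 1) -> float:
    """Fraction of neutrons traversing a 1-D slab in a one-speed toy model."""
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_histories):
        x, direction = 0.0, 1.0  # start at the left face, moving right
        while True:
            # Free-flight length sampled from an exponential distribution
            # with mean free path 1 / SIGMA_TOTAL.
            x += direction * -math.log(1.0 - rng.random()) / SIGMA_TOTAL
            if x >= SLAB_THICKNESS_CM:
                escaped += 1  # transmitted through the right face
                break
            if x < 0.0:
                break  # leaked back out of the left face
            # Reaction channel chosen from the cross-section ratios.
            if rng.random() < SIGMA_ABSORPTION / SIGMA_TOTAL:
                break  # absorbed
            direction = rng.choice((-1.0, 1.0))  # isotropic 1-D scattering


    return escaped / n_histories


print(f"transmitted fraction ~ {transmitted_fraction(100_000):.4f}")
```

Even in this toy, the result depends directly on the cross-section inputs, which is the point of the sensitivity studies mentioned above: uncertainties in the evaluated data propagate straight into the simulated reactor parameters.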