With the High Luminosity phase of the LHC programme approaching, scheduled to start in 2030, the Offline Software and Computing group of the CMS collaboration is reviewing the experiment’s computing model to ensure its readiness for the computing challenges the HL-LHC poses. An in-depth revision of the current model, tools and practices is being carried out, along with a programme of R&D activities to evolve them, with the goal of managing the available computing resources effectively and successfully exploiting the abundance of data the HL-LHC will deliver.
This evolution includes a review of the data management and workflow management systems, to ensure that they scale to the growing data volumes and to the level of resources required to store, process and analyze them. Larger and more heterogeneous computing resources, from both grid and high-performance computing centers, are being integrated into the global CMS computing pool. Critical progress is also being made in the adoption of new data analysis infrastructures and paradigms. Additionally, further integration of novel technologies in multiple computing areas is being explored. This contribution will present the status of the CMS computing model evolution in the aforementioned key areas, among others.