Paris Workshop on Bayesian Deep Learning for Cosmology and Time-Domain Astrophysics, 3rd edition
Buffon Amphitheater
Université Paris Cité
Machine learning attracts considerable interest in cosmology and time-domain astronomy and may lead to major breakthroughs. Its adoption by the scientific community has increased dramatically in the past few years. Progress in the machine learning community can bring a great deal to ours and deserves to be monitored closely.
Among the developments of interest to the astronomy community, probabilistic machine learning models, especially Bayesian neural networks, provide uncertainty estimates by combining deep neural network architectures with Bayesian inference.
Cosmologists also rely heavily on forward modelling and often face intractable likelihoods. Modern techniques such as simulation-based inference and differentiable programming can be very valuable for model selection and model parameter inference.
Time-series analyses have long relied on recurrent neural networks and, more recently, on transformers; other analyses use convolutional or graph neural networks. The workshop will explore whether new architectures such as graph transformers could be relevant in some fields, and how such networks can be made probabilistic.
This workshop will give participants the opportunity to learn more about these emerging methods and how to use and exploit them in their research. The program includes invited lectures and tutorials from leading computer science experts, invited and contributed talks, and poster sessions aimed at sharing experience between physicists on the practical applications of machine learning. It is intended for researchers and students who are familiar with machine learning, use these algorithms in their own work, and want to learn about advanced techniques related to Bayesian deep learning.
Workshop program
- Tues 20 May – "School day" with 4 x 90 min interactive sessions
- Wed 21–Fri 23 May – Workshop with keynote speakers, contributed talks, tutorials, round tables and lightning talks.
Workshop topics
- Cosmology and time-domain astronomy applications of Bayesian deep networks
- Methods for quantifying model uncertainty
- Anomaly and outlier detection
- Simulation-based and likelihood-free inference
- Probabilistic ML frameworks
- Use of Bayesian deep learning outside of academia
- Ethics of large-scale machine learning
Important dates
- April 4th 2025 – pre-registration, registration and call for contributions start
- May 15th 2025 – contributions application deadline
- May 16th 2025 – end of registration
Confirmed speakers
- Federica Bianco (remote), University of Delaware, LSST Rubin TVS collaboration
- Samuel Farrens, Cosmostat AIM
- François Lanusse, Cosmostat AIM
- Konstantin Leyde, ICG Portsmouth
- Anaïs Möller (remote), Swinburne University of Technology
- Julien Peloton, IJCLab, FINK
- Justine Zeghal, Université de Montréal, MILA
Timetable
08:30 → 09:00  Workshop: Welcome to the school and registration
Turing Amphitheater, Sophie Germain building, Université Paris Cité
09:00 → 12:30  BDL School: morning session
Turing Amphitheater, Sophie Germain building, Université Paris Cité
Session chair: Alexandre Boucaud (APC / IN2P3)
09:00  Foundation Models for Astronomy (lecture, 1h 30m)
Speaker: François Lanusse (CNRS UMR 7158)
10:30  Coffee break (30m)
11:00  Density Estimation and Simulation-Based Inference: introduction and applications to galaxy formation (lecture, 1h 30m)
Speaker: Marc Huertas-Company (Université de Paris)
13:45 → 14:00  Welcome-back coffee (15m)
14:00 → 17:15  BDL School: afternoon session
Turing Amphitheater, Sophie Germain building, Université Paris Cité
Session chair: Cécile Roucelle (APC)
14:00  Machine learning for time-domain astronomy at scale: the example of Fink (1h 30m)
Speaker: Dr Julien Peloton (CNRS-IJCLab)
15:30  Break (15m)
09:00 → 09:15  Workshop: Welcome, perspectives and outline of the workshop
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
09:15 → 10:15  Gravitational waves
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
09:15  Simulation-based inference for gravitational-wave science (keynote + discussion, 1h)
Speaker: Konstantin Leyde (ICG Portsmouth)
10:15 → 10:20  Flash talks #1
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
10:15  Deep K-Correct: Estimating K-Corrections and Absolute Magnitudes from Galaxy Images (flash talk, 5m)
The estimation of K-corrections and absolute magnitudes is essential in extragalactic astronomy for comparing galaxy properties across different redshifts. The state-of-the-art methods rely on deterministic template-fitting techniques: Blanton’s KCorrect, which estimates K-corrections and absolute magnitudes from photometry and redshift, and FastSpecFit, which refines these estimates by incorporating spectroscopic data. However, spectroscopy is expensive and often unavailable for large galaxy surveys. We introduce Deep K-Correct, a novel approach that leverages the latent space of the astronomical foundation model AstroCLIP to estimate K-corrections and absolute magnitudes directly from galaxy images. Unlike traditional methods, Deep K-Correct eliminates the need for photometric or spectroscopic measurements, utilizing the rich representations learned from millions of galaxies. We explored both zero-shot learning (using K-Nearest Neighbors) and few-shot learning (training neural network layers on top of the embeddings). Our results show that Deep K-Correct matches and marginally surpasses the accuracy of KCorrect in predicting K-corrections while requiring only images and redshift as input. We discuss the tuning of the method for the upcoming LSST images and the computation of other physical properties for stars and galaxies using foundation models.
Speaker: Mr Jeremias Rodriguez
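A minimal sketch of the zero-shot step mentioned above: k-nearest-neighbour regression on precomputed image embeddings. The embedding matrices, reference K-corrections, and array sizes below are toy placeholders, not the actual AstroCLIP pipeline.
```python
# Toy zero-shot K-correction estimate from precomputed image embeddings.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
emb_train = rng.normal(size=(5000, 128))   # embeddings of galaxies with known K-corrections
kcorr_train = rng.normal(size=5000)        # their reference K-corrections (toy values)
emb_test = rng.normal(size=(10, 128))      # embeddings of new galaxies

knn = KNeighborsRegressor(n_neighbors=16, weights="distance")
knn.fit(emb_train, kcorr_train)
kcorr_pred = knn.predict(emb_test)         # predicted K-corrections for the new galaxies
```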
10:20 → 11:10  Coffee break + poster session #1
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
11:10 → 12:30  Gravitational waves
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
11:10  Non-Parametric Normalizing Flow Modeling of Binary Black Hole Populations for Unbiased Dark Siren Cosmology (20m)
Gravitational-wave (GW) dark siren methods offer a powerful way to measure the expansion of the Universe without electromagnetic counterparts, but they rely critically on accurate models of the underlying binary black-hole (BBH) population. In fact, conventional parametric descriptions of BBH mass and redshift distributions can lead to significant biases in cosmological inferences when the assumed functional forms are incorrect. We present a fully non-parametric approach using Normalizing Flows (NFs) to model the joint BBH source-frame mass and redshift distribution within a hierarchical Bayesian framework. Training on simulated GW catalogs representative of current observing runs (∼10²–10³ events), we optimize the NF parameters by maximizing the hierarchical likelihood, demonstrating flexible recovery of complex, multimodal features without manual tuning of functional forms. This non-parametric population model might mitigate biases in dark siren cosmology arising from oversimplified population assumptions. Future work will extend this framework to full posterior sampling of the NF parameters and application to real GW data.
Speaker: Leonardo Iampieri (Università di Roma La Sapienza, INFN)
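A minimal sketch of the hierarchical likelihood that such a flow-based population model is trained to maximize: each event contributes a Monte Carlo average of the population density over its posterior samples, reweighted by the sampling prior. The names are assumptions; `log_pop_density` stands in for the normalizing flow's log-density.
```python
# Hierarchical population log-likelihood estimated from per-event posterior samples.
import numpy as np

def hierarchical_log_likelihood(event_samples, event_log_prior, log_pop_density):
    """event_samples: list of (S_i, D) arrays of posterior samples (e.g. mass, redshift).
    event_log_prior: list of (S_i,) arrays, log of the sampling prior at those samples.
    log_pop_density: callable returning the population log-density at given samples."""
    total = 0.0
    for samples, log_prior in zip(event_samples, event_log_prior):
        log_w = log_pop_density(samples) - log_prior               # importance weights
        total += np.logaddexp.reduce(log_w) - np.log(len(log_w))   # log of the sample mean
    return total

# toy usage: 5 events, flat sampling prior, Gaussian population standing in for the flow
rng = np.random.default_rng(0)
samples = [rng.normal(size=(200, 2)) for _ in range(5)]
log_prior = [np.zeros(200) for _ in range(5)]
print(hierarchical_log_likelihood(samples, log_prior,
                                  lambda s: -0.5 * np.sum(s**2, axis=1)))
```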
11:30  GW parameter estimation with DNNs (20m)
Speaker: Edward Porter (APC/CNRS)
11:50  Disentangling the Gravitational Symphony: Machine Learning for LISA's Global Fit (20m)
The upcoming ESA mission LISA (Laser Interferometer Space Antenna), scheduled for launch in 2037, will usher in a transformative era in astrophysics by detecting gravitational waves from space. Unlike traditional observatories limited to the electromagnetic spectrum, LISA will directly probe the fabric of space-time, revealing a new and complex landscape of astrophysical signals from galactic binaries, massive black hole mergers, and extreme mass ratio inspirals (EMRIs). However, the immense scientific potential of LISA hinges on solving an unprecedented data analysis challenge: the Global Fit problem. This involves the simultaneous inference of numerous overlapping signals and instrument noise, framed in a high-dimensional Bayesian setting.
Current approaches rely on computationally intensive Markov chain Monte Carlo (MCMC) techniques with block Gibbs sampling across source classes. Yet, these methods suffer from poor scalability and slow convergence, especially in the presence of source confusion and uncertainty in source number. To address these issues, we introduce GWINESS (Gravitational Wave Inference using NEural Source Separation), a machine learning-based framework inspired by music source separation. Using an encoder-decoder neural architecture, GWINESS aims to perform blind source separation of overlapping gravitational-wave signals, analogous to isolating vocals, drums, and bass in a song. By pre-processing LISA data and identifying distinct source components (e.g., MBHBs, EMRIs, GBs), we will try to accelerate convergence and improve the initialization of classical MCMC pipelines.
This talk will present the core principles behind GWINESS, highlight the challenges of the Global Fit for LISA, and demonstrate how hybridizing physics-based inference with deep learning can dramatically reduce computational costs. We will discuss current limitations and future directions for integrating ML methods in LISA's Global Fit.
Speaker: Antsa Rasamoela (L2I Toulouse, CNRS/IN2P3, Université de Toulouse)
12:10  The plug-and-play approach: learning the prior distribution from simulations. Application to gravitational-wave reconstruction (20m)
Current efforts in the LIGO-Virgo-KAGRA (LVK) collaboration focus on either parameter estimation from the observations of all available detectors, or denoising the measurements from a single detector. In this work, we address an intermediate task: reconstructing the original gravitational wave (GW) from the measurements of all available detectors. This inverse problem is ill-posed, and thus requires the choice of a prior distribution that has a critical impact on the reconstruction quality. In this presentation, we will describe how we adapted the plug-and-play approach (which learns a general prior from a dataset using deep learning, and currently produces state-of-the-art results in natural image processing) to the reconstruction of GWs. We will also show an application to a synthetic signal, and detail the key geometric features the learned prior preserves.
Speaker: Pierre Palud (APC)
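A minimal sketch of a plug-and-play reconstruction loop for a linear observation model y = A x + n: each iteration takes a gradient step on the data-fidelity term and then applies a denoiser in place of an explicit prior. The Gaussian filter is a stand-in for the deep prior learned from simulations, and all names and sizes are illustrative.
```python
# Plug-and-play proximal-gradient sketch for a toy 1D linear inverse problem.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def pnp_reconstruct(y, A, denoise, step=0.5, n_iter=200):
    x = A.T @ y                                   # simple initial guess
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                  # gradient of the data-fidelity term
        x = denoise(x - step * grad)              # the denoiser plays the role of the prior
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(80, 200)) / np.sqrt(200)     # toy measurement operator
x_true = np.sin(np.linspace(0, 6 * np.pi, 200))
y = A @ x_true + 0.05 * rng.normal(size=80)
x_hat = pnp_reconstruct(y, A, lambda u: gaussian_filter1d(u, sigma=2.0))
```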
13:45 → 14:00  Welcome-back coffee (15m)
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
14:00 → 16:30  Static sky cosmology
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
14:00  Using AI to probe the invisible universe: machine learning methods for weak gravitational lensing (keynote + discussion, 1h)
Speaker: Samuel Farrens (CosmoStat, CEA Paris-Saclay)
15:00  Break (30m)
15:30  Outliers and Hidden Relations in Galaxy Spectroscopy and Photometry (20m)
I will present the neural network architecture "spender", which is specifically designed for galaxy spectra at variable redshifts. Trained on 500k SDSS or DESI spectra, it is capable of automatically detecting highly meaningful outliers as well as making predictions of the physical state of the galaxies, thus serving as a summary for simulation-based inference approaches. Recently, my group has further demonstrated that the representations learned from optical spectra provide accurate predictions of IR photometry, a connection that is not captured by current physical SED modeling methods. I will also discuss extensions of this work to exoplanets and quasar spectra to demonstrate the strengths and versatility of this approach.
Speaker: Peter Melchior (Princeton University)
15:50  Pixel-level Bayesian Analysis of PSFs and Strong Gravitational Lenses (20m)
Score-based diffusion models have emerged as a powerful tool for high-dimensional Bayesian analysis. Here we present work done at the Université de Montréal to analyze strong gravitational lenses. As a first step, we sample robust posteriors over high-resolution point spread function models to represent the image distortions in the Hubble Space Telescope. We then perform source reconstruction for strong gravitational lenses, effectively undoing the warping caused by gravitational lensing. These methods perform extraordinarily well and set a new standard for state-of-the-art analysis of these systems. Our posterior samples are simultaneously higher likelihood and visually indistinguishable from unlensed galaxies.
Speaker: Connor Stone (Université de Montréal)
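A minimal sketch of the underlying idea of score-based posterior sampling for a linear observation model: unadjusted Langevin steps whose drift combines the likelihood gradient with a prior score function. Here `prior_score` is a placeholder for a trained diffusion model; the standard-normal prior, toy operator, and step sizes are assumptions.
```python
# Langevin posterior sampling with a plug-in prior score (toy linear model).
import numpy as np

def langevin_posterior_sample(y, A, sigma_noise, prior_score,
                              n_steps=2000, step=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=A.shape[1])
    for _ in range(n_steps):
        grad_loglik = A.T @ (y - A @ x) / sigma_noise**2   # d/dx log p(y | x)
        score = grad_loglik + prior_score(x)               # d/dx log p(x | y)
        x = x + step * score + np.sqrt(2 * step) * rng.normal(size=x.shape)
    return x

rng = np.random.default_rng(2)
A = rng.normal(size=(20, 50)) / np.sqrt(50)
x_true = rng.normal(size=50)
y = A @ x_true + 0.1 * rng.normal(size=20)
sample = langevin_posterior_sample(y, A, 0.1, prior_score=lambda x: -x)  # N(0,1) prior score
```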
16:10  Learning to Deblend Galaxies from Blended Observations with Diffusion Models (20m)
Galaxy surveys such as LSST require robust deblending methods to separate overlapping sources in crowded fields, a challenging inverse problem due to PSF convolution, noise, and source mixing. In this talk, I present a Bayesian framework that leverages diffusion models to learn a prior on galaxy light profiles directly from blended observations. Building on a recent expectation-maximization approach for training diffusion models with corrupted data, our method enables full posterior sampling as well as MAP estimation for each deblended source. The likelihood is explicitly defined and differentiable, thanks to our scarlet2 framework, which models PSF effects, source mixing, and resampling, making it suitable for multi-resolution imaging.
Looking ahead, the next decade of galaxy surveys will deliver unprecedented overlapping data across multiple wavelengths, epochs, and resolutions. Our framework enables joint analysis of these multi-survey datasets, improving the quality of galaxy catalogs and opening the door to a wide range of astrophysical and cosmological applications, including time-domain discoveries and better constraints on fundamental parameters.
Speaker: Benjamin Remy (Princeton University)
16:30 → 17:00  Break (30m)
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
17:00 → 18:15  Round table: Climate change and machine learning: can AI be frugal?
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
Session chair: Cécile Roucelle (APC)
17:00  Climate change and machine learning: can AI be frugal? (1h)
- Estimate the environmental impact of an AI system - Green Algorithms: http://calculator.green-algorithms.org/ai
- Track the carbon and energy impact of your code: https://codecarbon.io/ - source code: https://github.com/mlco2/codecarbon
- Estimate the environmental impact of an API call to an LLM - online calculator: https://huggingface.co/spaces/genai-impact/ecologits-calculator - code: https://github.com/genai-impact/ecologits
- Compare models: https://huggingface.co/AIEnergyScore
- Compress models: https://github.com/PrunaAI/pruna
Speakers: Claire Monteleoni, Cécile Roucelle (APC), Juliette Fropier, Laurent Daudet, Romaric David
19:00 → 22:00  Social event: cocktail dinner (apéritif dînatoire) at café Cayo
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
09:00 → 10:00  Bayesian inferences
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
09:00  Neural compression and neural density estimation for cosmological inference (keynote + discussion, 1h)
Speaker: Justine Zeghal (APC)
10:00 → 10:20  Flash talks #2
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
10:00  Bayesian neural network for active learning and applications to gravitational-wave source population inference (flash talk, 5m)
LIGO and Virgo have detected about a hundred merging compact binaries in their first three science runs. This number will grow by a factor of 2 to 3 with the current run O4. This will allow a detailed analysis of the source population and possibly identify their origin and formation channel(s). The complex sequence leading to the formation of a compact binary from an isolated binary of stars can be modeled using hydrodynamic stellar evolution simulations such as MESA. These simulations have a large computational cost, typically taking hours each.
We propose to train a neural network to replace these simulations. Training a neural network with simulated data can be challenging if the computational cost of acquiring a large enough training dataset is high. Often, we rely on training data computed on a regularly spaced grid, or based on an a priori defined scheme. But is there a way to optimize the choice of this training dataset? Can we maintain the performance, or achieve better results with less training data?
Active learning aims at efficiently sampling the training dataset for machine learning applications. We start with a small pool of points that is progressively extended with samples that are "most informative" for the model. Our approach is to use the uncertainty estimation produced by a Bayesian neural network to choose which new point should be simulated and included within the updated training set.
We will present first preliminary results on synthetic data. In the future we plan to apply this to MESA simulations of massive stellar binaries.
Speaker: Theo Courty (Université Paris Dauphine)
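A minimal sketch of the active-learning loop outlined above, with a small deep ensemble standing in for the Bayesian neural network's predictive uncertainty and a cheap analytic function standing in for the expensive simulator; `run_simulation` is a placeholder, not a MESA interface.
```python
# Uncertainty-driven active learning with an ensemble surrogate (toy setup).
import numpy as np
from sklearn.neural_network import MLPRegressor

def run_simulation(x):                                    # placeholder for an expensive simulator
    return np.sin(3 * x) + 0.1 * x**2

rng = np.random.default_rng(0)
x_pool = rng.uniform(-3, 3, size=(500, 1))                # candidate simulation inputs
x_train = rng.uniform(-3, 3, size=(10, 1))                # small initial training set
y_train = run_simulation(x_train).ravel()

for _ in range(20):                                        # active-learning iterations
    ensemble = [MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                             random_state=k).fit(x_train, y_train) for k in range(5)]
    preds = np.stack([m.predict(x_pool) for m in ensemble])
    idx = int(np.argmax(preds.std(axis=0)))                # most uncertain candidate
    x_new = x_pool[idx:idx + 1]
    x_train = np.vstack([x_train, x_new])                  # run the simulator only there
    y_train = np.append(y_train, run_simulation(x_new).ravel())
```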
10:05  Representation Learning for Anomaly Detection and Unsupervised Classification of Variable X-ray Sources (flash talk, 5m)
We present a novel representation learning method for downstream tasks like anomaly detection, unsupervised classification, and similarity searches in high-energy data sets. This enabled the discovery of a new extragalactic fast X-ray transient (FXT) in Chandra archival data, XRT 200515, a needle-in-the-haystack event and the first Chandra FXT of its kind. Recent serendipitous discoveries in X-ray astronomy, including FXTs from binary neutron star mergers and an extragalactic planetary transit candidate, highlight the need for systematic transient searches in X-ray archives. We introduce new event file representations, E-t maps and E-t-dt cubes, that effectively encode both temporal and spectral information, enabling the seamless application of machine learning to variable-length event file time series. Our unsupervised learning approach employs PCA or sparse autoencoders to extract low-dimensional, informative features from these data representations, followed by clustering in the embedding space with DBSCAN. New transients are identified within transient-dominant clusters or through nearest-neighbour searches around known transients, producing a catalogue of 3559 candidates (3447 flares and 112 dips). XRT 200515 exhibits unique temporal and spectral variability, including an intense, hard <10 s initial burst, followed by spectral softening in an ~800 s oscillating tail. We interpret XRT 200515 as either the first giant magnetar flare observed at low X-ray energies or the first extragalactic Type I X-ray burst from a faint, previously unknown low-mass X-ray binary in the LMC. Our method extends to data sets from other observatories such as XMM–Newton, Swift-XRT, eROSITA, Einstein Probe, and upcoming missions like AXIS.
Speaker: Steven Dillmann (Stanford University)
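A minimal sketch of the unsupervised pipeline described above: PCA embedding of event-file representations, DBSCAN clustering with outliers flagged as anomaly candidates, and a nearest-neighbour similarity search around a chosen object. The random features are toy stand-ins for real E-t / E-t-dt representations.
```python
# PCA + DBSCAN + nearest-neighbour search on toy event-file features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
features = rng.normal(size=(2000, 576))                    # flattened E-t maps (synthetic)

embedding = PCA(n_components=15).fit_transform(features)   # low-dimensional embedding
labels = DBSCAN(eps=3.0, min_samples=10).fit_predict(embedding)
anomaly_idx = np.where(labels == -1)[0]                    # DBSCAN outliers = candidate anomalies

nn = NearestNeighbors(n_neighbors=10).fit(embedding)       # similarity search around object 0
distances, neighbours = nn.kneighbors(embedding[:1])
```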
10:20 → 10:50  Coffee break + poster session #2
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
10:50 → 12:30  Bayesian inferences
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
10:50  Benchmarking field-level cosmological inference from galaxy redshift surveys (20m)
Field-level inference has emerged as a promising framework to fully harness the cosmological information encoded in next-generation galaxy surveys. It involves performing Bayesian inference to jointly estimate the cosmological parameters and the initial conditions of the cosmic field, directly from the observed galaxy density field. Yet, the scalability and efficiency of sampling algorithms for field-level inference of large-scale surveys remain unclear. To address this, we introduce a standardized benchmark using a fast and differentiable simulator for the galaxy density field based on JaxPM. We evaluate a range of sampling methods, including standard Hamiltonian Monte Carlo (HMC), the No-U-Turn Sampler (NUTS) without and within a Gibbs scheme, and both adjusted and unadjusted microcanonical samplers (MAMS and MCLMC). These methods are compared based on their efficiency, in particular the number of model evaluations required per effective posterior sample.
Our findings emphasize the importance of carefully preconditioning latent variables and demonstrate the significant advantage of (unadjusted) MCLMC for scaling to $\geq 10^6$-dimensional problems. We find that MCLMC outperforms adjusted samplers by over an order of magnitude, with a mild scaling with the dimension of our inference problem.
Speaker: Hugo Simon (CEA Paris-Saclay)
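A minimal sketch of the efficiency metric used in such benchmarks, effective sample size per model evaluation, computed with a crude initial-positive-sequence autocorrelation estimate on a toy AR(1) chain. The chain and the evaluation count are illustrative and unrelated to JaxPM itself.
```python
# Effective sample size per model evaluation (toy chain).
import numpy as np

def effective_sample_size(chain):
    """Crude ESS for a 1D chain from the initial positive autocorrelation sum."""
    x = chain - chain.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf /= acf[0]
    tau = 1.0
    for rho in acf[1:]:
        if rho <= 0:
            break
        tau += 2 * rho
    return len(chain) / tau

rng = np.random.default_rng(0)
chain = np.zeros(20_000)
for i in range(1, len(chain)):                    # AR(1) chain standing in for one latent mode
    chain[i] = 0.9 * chain[i - 1] + rng.normal()
n_model_evaluations = len(chain)                  # e.g. one forward/gradient call per step
print(effective_sample_size(chain) / n_model_evaluations)
```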
11:10  Bayesian inverse problem with scattering transform: application to instrumental decontamination (20m)
Decontaminating a signal of interest is a recurring challenge in astrophysics and cosmology. Given the stochastic nature of usual contaminations (for instance instrumental, or from cosmological background or Galactic foregrounds), it can be framed as an ill-posed inverse problem. A Bayesian approach is needed to recover a distribution of signals compatible with the observed data. We propose a method to estimate clean signal statistics from a single contaminated observation, assuming only that the signal of interest is well described by a maximum entropy distribution conditioned on scattering transform statistics, and that we are able to sample the contamination distribution. It uses generative modelling conditioned on scattering transform statistics to estimate a simple mapping between clean and contaminated signals in the scattering statistic’s space. Validated on large-scale structure maps with a complex instrumental contamination model (beam + noise + masks), our approach recovers a posterior distribution of key astrophysical statistics—power spectrum, PDF, and Minkowski functionals—down to an order of magnitude below the contamination level.
Speaker: Sebastien Pierre (LPENS)
11:30  CMBAgent -- Open Source Planning and Control System for Science with Large Language Model Agents (20m)
We may be on the cusp of a paradigm shift in scientific research, where hypotheses, experiments, and interpretations are autonomously generated and implemented by multi-agent AI systems. We will present recent developments in cosmology and astrophysics where early prototypes of such systems are already being deployed on cutting-edge observational and simulation datasets.
Speaker: Boris Bolliet (Cambridge)
11:50  What does it take to make MCMC feasible in very high dimensions? (20m)
Sampling from high-dimensional distributions is an important tool in Bayesian inference problems, such as cosmological field-level inference and Bayesian neural networks (BNNs).
Hamiltonian Monte Carlo and its tuning-free implementation NUTS have pushed the limits of typical dimensionalities where sampling is feasible. I will show that this limit can be pushed further by disposing of the Metropolis-Hastings adjustment, at the cost of introducing asymptotic bias. I will show how this bias can be controlled to be negligible compared to the Monte Carlo error, resulting in tuning-free implementations of unadjusted Hamiltonian, Langevin, and Microcanonical Langevin Monte Carlo. I will also show how it can be used to improve sampling performance with massive parallelization. Finally, I will show applications to real-world problems, including BNNs.
Speaker: Jakob Robnik (University of California at Berkeley)
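A minimal sketch of the idea of dropping the Metropolis-Hastings adjustment: a leapfrog trajectory whose endpoint is always accepted, trading a small asymptotic bias for cheaper high-dimensional sampling. The standard-normal target and fixed step sizes are toy assumptions, not the tuning-free schemes discussed in the talk.
```python
# Unadjusted HMC step: leapfrog integration with no accept/reject correction.
import numpy as np

def grad_neg_log_p(x):                            # standard normal target
    return x

def unadjusted_hmc_step(x, rng, step=0.1, n_leapfrog=10):
    p = rng.normal(size=x.shape)                  # fresh momentum
    p = p - 0.5 * step * grad_neg_log_p(x)        # half kick
    for _ in range(n_leapfrog - 1):
        x = x + step * p                          # drift
        p = p - step * grad_neg_log_p(x)          # kick
    x = x + step * p
    p = p - 0.5 * step * grad_neg_log_p(x)
    return x                                      # accepted unconditionally

rng = np.random.default_rng(0)
x = np.zeros(10_000)                              # a 10^4-dimensional state
for _ in range(100):
    x = unadjusted_hmc_step(x, rng)
```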
12:10  JaxPM: A JAX-Based Framework for Scalable and Differentiable Particle Mesh Simulations in Cosmology (20m)
Speaker: Wassim Kabalan (CNRS APC/IN2P3)
13:45 → 14:00  Welcome-back coffee (15m)
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
14:00 → 17:00  Time domain astrophysics
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
14:00  Do androids dream of exploding stars and receding galaxies? (1h)
Speaker: Federica Bianco (NYU)
15:00  Break (20m)
15:20  Bayesian Neural Networks and supernova classification (keynote + discussion, 1h)
Speaker: Anaïs Möller (Swinburne University)
16:20  Accounting for Selection Effects in Supernova Cosmology with Simulation-Based Inference and Hierarchical Bayesian Modelling (20m)
Type Ia supernovae (SNe Ia) are thermonuclear exploding stars that can be used to put constraints on the nature of our universe. One challenge with population analyses of SNe Ia is Malmquist bias, where we preferentially observe the brighter SNe due to limitations of our telescopes. If untreated, this bias can propagate through to our posteriors on cosmological parameters. In this work, we develop a novel technique of using a normalising flow to learn the non-analytical likelihood of observing a SN Ia for a given survey from simulations, independently of any cosmological model. The learnt likelihood is then used in a hierarchical Bayesian model with Hamiltonian Monte Carlo sampling to put constraints on different sets of cosmological parameters conditioned on the observed data. Firstly, we verify this technique on toy model simulations, finding excellent agreement with analytically derived posteriors. We will conclude by demonstrating that this method recovers the true underlying cosmology when applied to realistic SNANA survey simulations where there is no analytical solution.
Speaker: Benjamin Boyd (University of Cambridge)
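A minimal sketch of the standard selection-effect (Malmquist) correction inside a hierarchical model: the observed-sample log-likelihood is penalised by the probability that a supernova drawn from the population passes the survey selection, estimated by Monte Carlo. `log_detection_prob` stands in for the selection model learned from survey simulations, for instance with a normalising flow; all names are assumptions.
```python
# Selection-corrected hierarchical log-likelihood (schematic).
import numpy as np

def selection_corrected_loglike(obs_loglikes, population_samples, log_detection_prob):
    """obs_loglikes: (N_obs,) per-SN log-likelihood terms at the current parameters.
    population_samples: (S, D) draws from the population model at those parameters.
    log_detection_prob: callable giving log P(detected | SN properties)."""
    log_p_det = log_detection_prob(population_samples)
    # Monte Carlo estimate of log alpha = log E[P(detected | theta)]
    log_alpha = np.logaddexp.reduce(log_p_det) - np.log(len(log_p_det))
    return obs_loglikes.sum() - len(obs_loglikes) * log_alpha
```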
16:40  Classification of Transient Event Candidates in LSST Without Human-Labeled Training Sets (20m)
The Vera C. Rubin Observatory’s LSST will detect millions of transient candidates through difference image analysis (DIA), issuing real-time alerts to the community. While DIA is highly sensitive, it is prone to spurious detections caused by noise, artifacts, or imperfect subtractions. Filtering these out—known as the "Real/Bogus" problem—typically relies on supervised machine learning trained on large, manually labeled datasets, which are costly and hard to scale.
We present a novel method for training classifiers without human labels, using synthetic source injection on real survey images to generate reliable training sets. Crucially, we address contamination from real, undetected astrophysical events in the negative class, enabling robust learning despite the absence of direct ground truth.
Our approach eliminates the need for extensive human annotation or unrealistic simulations, and additionally enables the discovery of missed transients in archival data. This opens the door to scalable, label-efficient classification in LSST and other large time-domain surveys.
Speaker: Bruno Sanchez (CPPM - CNRS)
17:00 → 17:15  Break (15m)
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
17:15 → 18:30  Round table: Agentic AI
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
17:15  Agentic AI (1h 15m)
Speakers: Boris Bolliet, Eric Aubourg (APC), François Lanusse, Pavlos Moraitis
09:05 → 12:10  Static sky cosmology
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris
09:05  Probabilistic inference of galaxy properties using multi-modal variational autoencoders (20m)
Variational autoencoders (VAEs) are powerful tools for inferring object properties, particularly well-suited for processing imaging data due to their architectural design. In this work, we use multi-modal VAEs to generate probability distributions for various galaxy properties using multi-band photometric observations. The VAE is trained on synthetic photometric and spectral datasets to infer properties such as redshift, spectral energy distributions (i.e., low-resolution spectra), and stellar population parameters. These inferences are derived from the latent space representations learned from observational or mock data, which may include either integrated galaxy magnitudes or image cutouts of individual sources. We validate our method using both simulated datasets from LSST and real observations from the Hyper Suprime-Cam Subaru Strategic Program (HSC-SSP), supplemented with spectroscopic data from DESI-DR1.
Speaker: Kirill Grishin (Université de Paris)
09:25  Interpretable Machine Learning for Constraining Self-Interacting Dark Matter in Galaxy Clusters (20m)
Cold dark matter, the standard cosmological model, faces several challenges on small scales that self-interacting dark matter may help resolve. Traditional methods to constrain the nature of dark matter often rely on summary statistics, which discard much of the available information, or require complex and computationally expensive lensing models. Machine learning (ML) has gained traction in astronomy for its ability to extract features from high-dimensional data, but its black-box nature raises concerns for scientific inference.
We present an interpretable ML algorithm for constraining the dark matter cross-section from cosmological simulations of galaxy clusters. Our algorithm embeds weak gravitational lensing maps into a low-dimensional latent space based on their similarity, allowing simulations to cluster based on their physical differences. This latent space provides a way to assess whether a test dataset, such as observations, lies within the training domain using a Bayesian framework, indicating whether any predicted parameter is reliable or requires extrapolation. We enforce one latent dimension to represent the dark matter cross-section, with each galaxy cluster acting as a sample from the simulation’s cross-section posterior. By applying ensemble learning, we reduce the predictive uncertainty and tighten the posterior constraints on the cross-section for the test dataset.
Our ML algorithm provides accurate parameter recovery alongside a measure of prediction confidence for improved ML trustworthiness.
Speaker: Ethan Tregidga (EPFL)
09:45  Hybrid Analytical-Deep Solver for reconstructing maps in Cosmology (20m)
The revolutionary methods of Machine Learning (ML) support most data science analyses today in many ways. An often-neglected question concerns the interpretability of the models used, and clarity on how the information inside our data is exploited. This work presents a physics-guided method combined with a deep learning architecture, to provide both the reliability and explainability of classical statistical techniques while gaining the speed and efficiency of neural networks. The work also covers the treatment of uncertainties on estimated physical parameters, which are often impossible to propagate in conventional usage of networks. The idea will be presented in the context of reconstructing maps and inferring cosmological parameters with QUBIC, a novel experiment for measuring the polarization of the Cosmic Microwave Background by utilizing interferometry. This approach is expandable to other scientific fields, and is highly relevant in current times of rising interest in Explainable Artificial Intelligence (XAI).
Speaker: Leonora Kardum
10:05  A Comprehensive Analysis of Beyond \(\Lambda\)CDM with Cosmological Data (20m)
This talk will explore discrepancies including the long-standing difference in the Hubble constant \(H_0\), as well as variations between Planck data and weak lensing measurements regarding the matter-energy density \(\Omega_m\) and the amplitude \(\sigma_8\) (or the redshift-space distortion \(f\sigma_8\)) of cosmic structures. These inconsistencies suggest the possibility of new physics beyond the standard \(\Lambda\)CDM model, such as early dark energy or modified gravity theories. Resolving these tensions is crucial for improving our understanding of the Universe's accelerating expansion and large-scale structure formation, and it may highlight limitations in existing cosmological models.
This study highlights resolving such cosmological tensions through alternative theories of gravity, using various observational datasets including Planck-2018, DESI, CMB, RSD, and SNIa. The observed tensions suggest potential discrepancies between model predictions and observational constraints, indicating a need for thorough evaluations of modified gravity theories as alternatives for cosmic acceleration. Resolving these tensions is critical for improving our understanding of dark energy, matter density, and the dynamics of the Universe. Furthermore, addressing the disparities in these measurements will inform the design of future cosmological surveys and help prioritize observational strategies, potentially leading to significant discoveries about the nature of gravity and the Universe.
Speaker: Dr Shambel Akalu (Centre for Space Research, North-West University)
10:25  Coffee break (30m)
10:55  Concluding talk: Bayesian Analysis of Multi-Channel Astronomical Data: From Time Domain to Static Sciences (keynote + discussion, 1h 15m)
Speaker: Biswajit Biswas (APC)
12:10 → 12:30  Workshop: wrap-up and discussions
Buffon Amphitheater, Université Paris Cité, 15 rue Hélène Brion, 75013 Paris