Session chairs
Talks: Wednesday A
- Francois Boulanger
Talks: Wednesday (shorter)
- There is no session chair for this block
Talks: Thursday A
- There is no session chair for this block
Talks
- There is no session chair for this block
Talks: Thursday B
- There is no session chair for this block
Talks: Thursday C
- There is no session chair for this block
Talks
- There is no session chair for this block
Talks: Friday A
- There is no session chair for this block
Talks
- There is no session chair for this block
Talks
- There is no session chair for this block
Talks
- There is no session chair for this block
In gravitational-wave astronomy, the main quantity we infer from an observation is the posterior distribution of the parameters describing the gravitational-wave model for a particular source.
The posterior distribution is estimated by applying Bayes' theorem and assuming that we know the model for the signal and that the noise is Gaussian with a certain power spectral...
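For reference, the standard setup sketched above (textbook notation, not specific to this talk) combines Bayes' theorem with the stationary-Gaussian-noise likelihood weighted by the power spectral density:

$$ p(\theta \mid d) = \frac{p(d \mid \theta)\,p(\theta)}{p(d)}, \qquad \ln p(d \mid \theta) = -\tfrac{1}{2}\,\langle d - h(\theta),\; d - h(\theta)\rangle + \mathrm{const}, $$
$$ \langle a, b \rangle = 4\,\mathrm{Re}\int_0^\infty \frac{\tilde a(f)\,\tilde b^{*}(f)}{S_n(f)}\,\mathrm{d}f, $$

where $h(\theta)$ is the waveform model and $S_n(f)$ is the one-sided noise power spectral density.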
Deep learning models offer several appealing properties for solving the inverse problem on transit light curves: they can learn arbitrary time-correlated noise distributions, provided there are enough examples; they typically scale well with the number of examples and free parameters; and they are highly flexible, allowing any differentiable module to be integrated. We discuss...
With the advent of new ground and space-based instruments that image exoplanets and record their spectra across a broader wavelength range and at higher spectral resolutions, complex atmospheric models are becoming crucial for a thorough characterization. This includes a detailed description of the clouds and their physics. However, since the microphysics of the clouds is not observable, this...
The exact nature of dark matter (DM) is still unknown, and the two-body decaying dark matter model, one of the minimal extensions of the standard cold dark matter model ($\Lambda$CDM), has been shown to be an interesting dark matter candidate, notably for its potential to relax the well-known $\sigma_8$ tension. Moreover, there have even been studies reporting a preference of this model over standard...
We train convolutional neural networks to correct the output of fast and approximate N-body simulations at the field level. Our model, Neural Enhanced COLA (NECOLA), takes as input a snapshot generated by the computationally efficient COLA code and corrects the positions of the cold dark matter particles to match the results of full N-body Quijote simulations. We quantify the accuracy of...
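As an illustration only (a toy field-level corrector under assumed shapes and layer sizes, not the authors' actual NECOLA architecture), a model of this kind can be sketched in PyTorch as a small 3D CNN that maps the fast-simulation displacement field to a residual correction:

```python
import torch
import torch.nn as nn

class ToyFieldCorrector(nn.Module):
    """Toy 3D CNN that predicts residual corrections to particle
    displacement fields from a fast (COLA-like) simulation.
    Illustrative only; not the NECOLA network."""

    def __init__(self, channels: int = 3, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(hidden, channels, kernel_size=3, padding=1),
        )

    def forward(self, fast_displacements: torch.Tensor) -> torch.Tensor:
        # Shape (batch, 3, N, N, N): three displacement components per voxel.
        # The network outputs a correction added to the fast-simulation field.
        return fast_displacements + self.net(fast_displacements)

# Hypothetical usage: train with an MSE loss against full N-body displacements.
model = ToyFieldCorrector()
fast = torch.randn(2, 3, 32, 32, 32)   # stand-in for a COLA snapshot
corrected = model(fast)
```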
Nested Sampling is an established numerical technique for optimising, sampling, integrating and scanning a priori unknown probability distributions. Whilst typically used in the context of traditional likelihood-driven Bayesian inference, its capacity as a general sampler means that it is capable of exploring distributions on data [2105.13923] and joint spaces [1606.03757].
In this talk...
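For readers unfamiliar with the algorithm, the sketch below is a minimal toy implementation of the basic nested sampling loop (brute-force prior rejection for the likelihood-constrained replacement step; real codes such as MultiNest or PolyChord use far more efficient strategies):

```python
import numpy as np

def nested_sampling(log_likelihood, sample_prior, n_live=100, n_iter=500):
    """Toy nested sampling estimate of log Z, with Z = integral of L(theta) pi(theta).
    `sample_prior(n)` must return n prior draws; the rejection step below is
    only viable for simple toy problems."""
    live = sample_prior(n_live)
    live_logL = np.array([log_likelihood(p) for p in live])
    log_Z = -np.inf
    log_X_prev = 0.0                       # log prior volume, starts at 1

    for i in range(1, n_iter + 1):
        worst = np.argmin(live_logL)       # lowest-likelihood live point
        log_X = -i / n_live                # expected shrinkage per iteration
        log_w = np.log(np.exp(log_X_prev) - np.exp(log_X))
        log_Z = np.logaddexp(log_Z, log_w + live_logL[worst])

        # Replace the worst point with a prior draw above the L threshold.
        while True:
            candidate = sample_prior(1)[0]
            cand_logL = log_likelihood(candidate)
            if cand_logL > live_logL[worst]:
                break
        live[worst], live_logL[worst] = candidate, cand_logL
        log_X_prev = log_X

    # Add the contribution of the remaining live points.
    log_Z = np.logaddexp(log_Z, log_X_prev - np.log(n_live)
                         + np.log(np.sum(np.exp(live_logL))))
    return log_Z

# Hypothetical usage: 2-D unit Gaussian likelihood, uniform prior on [-5, 5]^2.
rng = np.random.default_rng(0)
log_L = lambda t: -0.5 * float(np.sum(t**2)) - np.log(2 * np.pi)
prior = lambda n: rng.uniform(-5.0, 5.0, size=(n, 2))
print(nested_sampling(log_L, prior))   # analytic answer: log(1/100) ~ -4.6
```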
Likelihood-free inference allows scientists to perform traditional analyses such as parameter estimation and model comparison in situations where the explicit computation of a likelihood is impossible. Amongst all methods, Density Estimation LFI (DELFI) has excelled due to its efficient use of simulations.
However, despite its undeniable promise, current DELFI applications rely on a key...
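For readers who want to try density-estimation LFI, the open-source `sbi` package implements neural posterior estimation; the sketch below is a generic toy example using what I believe is the current `sbi` API (which may differ between versions), and is not the speaker's DELFI pipeline:

```python
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Toy simulator: data are noisy copies of the parameters.
def simulator(theta: torch.Tensor) -> torch.Tensor:
    return theta + 0.1 * torch.randn_like(theta)

prior = BoxUniform(low=-2 * torch.ones(2), high=2 * torch.ones(2))
theta = prior.sample((5000,))
x = simulator(theta)

# Fit a neural conditional density estimator for p(theta | x) to the simulations.
inference = SNPE(prior=prior)
density_estimator = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior(density_estimator)

# Amortised posterior: sample for a new "observation" with no further simulations.
x_obs = torch.tensor([0.3, -0.7])
samples = posterior.sample((1000,), x=x_obs)
```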
The frontier of likelihood-free inference typically involves density estimation with neural networks at its core. The resulting surrogate model used for inference faces the well-established challenges of capturing the modelling and parameter uncertainties of the network. In this contribution I will review progress made in building neural networks trained with nested sampling, which...
Simulation-based inference techniques will play a key role in the analysis of upcoming astronomical surveys, providing a statistically rigorous method for Bayesian parameter estimation. However, these techniques do not provide a natural way to perform Bayesian model comparison, as they do not have access to the Bayesian model evidence.
In my talk I will present a novel method to estimate...
Modeling strong gravitational lenses in order to quantify the distortions of the background sources and reconstruct the mass density in the foreground lens has traditionally been a major computational challenge. This requires solving a high dimensional inverse problem with an expensive, non-linear forward model: a ray-tracing simulation. As the quality of gravitational lens images increases...
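For context, the ray-tracing forward model referred to above evaluates the standard lens equation for every image-plane position (textbook notation, not the speaker's particular parameterisation):

$$ \beta = \theta - \alpha(\theta), \qquad \alpha(\theta) = \nabla \psi(\theta), $$

where $\beta$ is the unlensed source position, $\theta$ the image-plane position, and $\psi$ the lensing potential determined by the projected mass density of the deflector.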
New methodologies to characterise and model the galaxy population as a function of redshift, allowing us to overcome biases and systematic effects, are becoming increasingly necessary for modern galaxy surveys. In this talk, I'm going to describe a novel method we developed for the measurement of galaxy population properties ([Tortorelli+18][1], [Tortorelli+20][2], [Tortorelli+21][3]) that relies...
Late-time measurements of the Hubble Constant (H0) are in strong disagreement with estimates provided by early-time probes. As no consensus on an explanation for this tension has been reached, new independent measurements of H0 are needed to shed light on its nature. In this regard, multi-messenger observations of gravitational-wave standard sirens are very promising, as each siren provides a...
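As background on the standard-siren idea referred to above: the gravitational-wave amplitude yields the luminosity distance, an electromagnetic counterpart supplies the redshift, and in the low-redshift limit (textbook form, ignoring peculiar velocities; not this talk's full analysis)

$$ H_0 \simeq \frac{c\,z}{d_L}, $$

with a full analysis replacing this by the cosmological distance-redshift relation and accounting for peculiar velocities and selection effects.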
Type Ia supernovae (SNIa) are standardisable candles that allow tracing the expansion history of the Universe and constraining cosmological parameters, particularly dark energy. State-of-the-art Bayesian hierarchical models scale poorly to future large datasets, which will mostly consist of photometric-only light curves, with no spectroscopic redshifts or SN typing. Furthermore,...
The latest measurements of the Hubble constant, H$_0$, by local probes such as supernovae and early-Universe probes such as the Cosmic Microwave Background are still in ~$5 \sigma$ tension with each other. Time delay cosmography with strong gravitational lensing is one of the alternative independent methods that could shed light on this tension. The upcoming Legacy Survey of Space and Time should...
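For reference, time-delay cosmography constrains H$_0$ through the standard time-delay distance (textbook relation, independent of this talk's specific modelling choices):

$$ \Delta t_{ij} = \frac{D_{\Delta t}}{c}\,\Delta\phi_{ij}, \qquad D_{\Delta t} \equiv (1 + z_d)\,\frac{D_d D_s}{D_{ds}} \propto \frac{1}{H_0}, $$

where $\Delta\phi_{ij}$ is the Fermat-potential difference between lensed images $i$ and $j$, and $D_d$, $D_s$, $D_{ds}$ are angular-diameter distances to the deflector, to the source, and between the two.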
Recent advances in simulation-based inference algorithms using neural density estimators have demonstrated an ability to achieve high-fidelity posteriors. However, these methods require a large number of simulations, making their application extremely time-consuming.
To tackle this problem, we are investigating SBI methodologies that can make use of not only samples from a simulator...
We present a novel methodology to address high-dimensional posterior inference in a situation where the likelihood is analytically known, but the prior is intractable and only accessible through simulations. Our approach combines Neural Score Matching for learning the prior distribution from physical simulations, and a novel posterior sampling method based on Hamiltonian Monte Carlo and an...
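The identity that makes this kind of hybrid scheme possible (a generic statement of the idea, as I read the abstract) is that the posterior score separates into a tractable likelihood term and a prior term that can be learned from simulations:

$$ \nabla_\theta \log p(\theta \mid d) = \nabla_\theta \log p(d \mid \theta) + \nabla_\theta \log p(\theta), $$

so a gradient-based sampler such as HMC only needs the score, with the intractable prior contribution supplied by the neural score-matching model.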
Parametric stochastic simulators are ubiquitous in science, often featuring high-dimensional input parameters and/or an intractable likelihood. Performing Bayesian parameter inference in this context can be challenging. We present a neural simulation-based inference algorithm which simultaneously offers simulation efficiency and fast empirical posterior testability, which is unique among...
I will describe some applications of Truncated Marginal Neural Ratio Estimation (TMNRE) to cosmological simulation-based inference. In particular, I will report on using SBI for CMB power spectra (based on https://arxiv.org/abs/2111.08030) and realistic 21cm simulations (work in progress). Along the way, I plan to discuss some thoughts on how to incorporate active learning scenarios with...
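As background on ratio estimation in general (generic formulation, not specific to the applications above): a classifier trained to distinguish joint samples $(x,\theta) \sim p(x,\theta)$ from marginal samples $(x,\theta) \sim p(x)\,p(\theta)$ recovers the likelihood-to-evidence ratio,

$$ r(x, \theta) \equiv \frac{p(x \mid \theta)}{p(x)} = \frac{p(\theta \mid x)}{p(\theta)}, \qquad d(x, \theta) = \sigma\!\big(f_\phi(x, \theta)\big) \approx \frac{p(x, \theta)}{p(x, \theta) + p(x)\,p(\theta)}, $$

so that $r \approx d/(1-d)$; TMNRE additionally truncates the prior region and targets the marginal posteriors of interest.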
Deep generative models have proved to be powerful tools for likelihood-free inference, providing a promising avenue to address the problem of doing inference in very high-dimensional parameter space, particularly in the context of the upcoming generation of sky surveys. In this talk, I will present our ongoing exploration of the Hierarchical Probabilistic U-Net (HPU-Net) for generating...
We present a technique for constructing suitable posterior probability distributions in situations for which the sampling distribution of the data is not known. This is very useful for modern scientific data analysis in the era of “big data”, for which exact likelihoods are commonly either unknown, computationally prohibitively expensive or inapplicable because of systematic effects in the...
The correct interpretation of detailed astrophysical and cosmological data requires the confrontation with equally detailed physical and detector simulations. From a statistical perspective, these simulations typically take the form of Bayes networks with an often very large number of uncertain or random parameters. These are often intractable to analyse using likelihood-based techniques. In...
Strong lensing is a unique gravitational probe of low-mass dark matter (DM) halos, whose characteristics are connected to the unknown fundamental properties of DM. However, measuring the properties of individual halos in lensing observations with likelihood-based techniques is extremely difficult, since it requires marginalizing over the numerous parameters describing the configuration of the lens,...
Analyzing the light from strongly-lensed galaxies makes it possible to probe low-mass dark matter (DM) subhalos. These detections can give us insight into how DM behaves at small scales. Traditional likelihood-based analysis techniques are extremely challenging and time-consuming. One has to marginalize over all lens and source model parameters, which makes it practically intractable....
With Euclid and the Rubin Observatory starting their observations in the coming years, we need highly precise and accurate data analysis techniques to optimally extract the information from weak lensing measurements. However, the traditional approach based on fitting some summary statistics is inevitably suboptimal as it imposes approximations on the statistical and physical modelling. I will...
Modern cosmological experiments yield high-dimensional, non-Gaussian posterior distributions over cosmological parameters. These posteriors are challenging to interpret, in the sense that classical Monte-Carlo estimates of summary statistics, such as tension metrics, become numerically unstable. In this talk, I will present recent work where normalizing flows (NF) are used to obtain analytical...
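One example of a quantity that becomes tractable once an analytical NF density is available (the parameter-difference shift statistic; this may or may not be the specific tension metric discussed in the talk) is

$$ \Delta = \int_{p(\Delta\theta) > p(0)} p(\Delta\theta)\,\mathrm{d}\Delta\theta, \qquad n_\sigma = \sqrt{2}\,\mathrm{erf}^{-1}(\Delta), $$

where $p(\Delta\theta)$ is the NF estimate of the density of the difference $\Delta\theta = \theta_A - \theta_B$ between the two experiments' posterior samples.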
How much cosmological information is embedded in large-scale structure, and can we extract it? Modern cosmological surveys aim to capture rich images or "fields" of evolving cosmic structure but are often too massive to be interrogated pixel-by-pixel at the field level. We demonstrate that simulation-based compression and inference can be equivalent to all-pixel field likelihoods. We compare...
In this seminar, I will discuss challenges arising in cosmological data analysis: either likelihoods are intractable, or systematics in the data cannot be properly modelled. How can we make reliable inferences from noise- and systematics-dominated signals, such as the optical depth to reionization (tau) or the tensor-to-scalar ratio (r) from large angular scale CMB data? To address this, I will present...
Cosmological weak lensing in the era of modern high-precision cosmology has proven itself to be an excellent probe of key parameters of the standard ΛCDM Model (Lambda Cold Dark Matter). However, the cosmological inference task of working with weak lensing data involves a complex statistical problem to solve within the likelihood function (Jeffrey et al. 2021). Likelihood-free inference (LFI)...
Likelihood-free inference (LFI) makes it possible to evaluate non-trivial likelihood functions, while fully propagating all uncertainties from the data vectors to the final inferred parameters. Nevertheless, this necessitates computationally optimised yet realistic forward simulations, which are not trivial to procure for a cosmic shear analysis ([Jeffrey et al. 2020][1]).
In this...
Precision analysis of strong gravitational lensing images can in principle characterize the population of small-scale dark halos and consequentially constrain the fundamental properties of dark matter (DM). In reality, this analysis is extremely challenging, because the signal we are interested in has a sub-percent level influence on high-variance data dominated by statistical noise....
Constraining primordial non-Gaussianity using large-scale structure data usually requires accurate predictions of the matter bispectrum, significantly limiting the range of scales which can be considered (linear and mildly non-linear regimes).
In this talk, I will present a simulation-based inference approach which allows us to probe the non-linear regime. We combine the modal bispectrum...
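For context, the simplest (local) template of primordial non-Gaussianity that such bispectrum analyses constrain is the standard parameterisation (the modal approach mentioned above is designed to cover more general shapes):

$$ \Phi(\mathbf{x}) = \phi(\mathbf{x}) + f_{\mathrm{NL}}\left[\phi^{2}(\mathbf{x}) - \langle \phi^{2} \rangle\right], $$

with $\phi$ a Gaussian potential; the corresponding bispectrum peaks in squeezed triangle configurations.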
In the coming years, a new generation of sky surveys, in particular the Euclid Space Telescope and the Rubin Observatory's Legacy Survey of Space and Time (LSST), will discover more than 200,000 new strong gravitational lenses, an increase of more than two orders of magnitude compared to currently known samples. Accurate and fast analysis of such large volumes of data within a clear statistical...
Scattering transforms are a new kind of statistics recently developed in data science. They share ideas with convolutional networks, allowing an in-depth characterization of non-Gaussian fields, but do not require any training stage. In particular, these statistics make it possible to build realistic generative models from a single image, which can be used as forward models for LFI...
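As a reminder of what these statistics compute (standard first- and second-order scattering coefficients, in generic notation):

$$ S_1(\lambda_1) = \big\langle \left| I \star \psi_{\lambda_1} \right| \big\rangle, \qquad S_2(\lambda_1, \lambda_2) = \big\langle \left| \left| I \star \psi_{\lambda_1} \right| \star \psi_{\lambda_2} \right| \big\rangle, $$

where $I$ is the input field, $\psi_{\lambda}$ are wavelets indexed by scale and orientation, and $\langle \cdot \rangle$ denotes spatial averaging; since no weights are learned, no training stage is required.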
As with any lab experiment, every astronomical survey requires numerous design decisions, from the instrument itself all the way through to the analysis pipelines, ultimately impacting the degree to which its resulting data can answer the most pressing questions about the universe. By collecting vast quantities of uncertainty-dominated data, the upcoming Vera C. Rubin Observatory’s Legacy...