The next biannual meeting of the Rubin-LSST France community will be held on 22-24 November 2021 at LPNHE, Paris.
NB: Attendance will require a valid COVID certificate (pass sanitaire).
Organisation:
Registration is required only for in-person participants. It will also be possible to follow the meeting on Zoom:
https://u-paris.zoom.us/j/84181445336
The agendas of the most recent editions of this meeting can be found online:
The assembly hall (hall de montage) is located behind the Charpak amphitheatre.
Deep Learning models have been increasingly exploited in astrophysical studies, yet such data-driven algorithms are prone to producing biased outputs that are detrimental to subsequent analyses. Using galaxy photometric redshift estimation as an example, we propose a set of consecutive steps for resolving two biases in existing Deep Learning methods, namely redshift-dependent residuals and mode collapse. Experiments show that our methods control biases better than benchmark methods; they hold promise for future cosmological surveys and may be applied to regression problems and other studies that make use of data-driven models.
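As a rough illustration (not the authors' actual pipeline) of how the first of these biases can be diagnosed, one can bin the normalized prediction error in true redshift; the array names below are hypothetical stand-ins:

import numpy as np

def binned_photoz_bias(z_true, z_pred, n_bins=10):
    # Normalized residuals; any trend of their mean with z_true is
    # the "redshift-dependent residual" bias named in the abstract.
    dz = (z_pred - z_true) / (1.0 + z_true)
    edges = np.linspace(z_true.min(), z_true.max(), n_bins + 1)
    idx = np.digitize(z_true, edges[1:-1])  # bin indices 0..n_bins-1
    return np.array([dz[idx == i].mean() if np.any(idx == i) else np.nan
                     for i in range(n_bins)])

A flat returned curve indicates residuals that are unbiased across redshift; mode collapse would instead show up as predictions piling onto a few preferred redshift values.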
Exploiting the gigantic flow of transient-sky alerts uncovered by LSST will be one of the most outstanding challenges that the Time-Domain Astrophysics community will face in the coming years. I will present a Phase-A project for a small robotic telescope operating from the Indian Ocean that will cover, 8 hours ahead of Chile, the fields visited by LSST every night. Reaching typical magnitudes of 20 in the R band, this facility is meant to provide unique constraints on the early phase of the brightest transient sources discovered by LSST, thanks also to an articulation that we plan to set up with the FINK full-stream alert broker. It is also planned to react to transient-sky alerts sent through VOEvents, so as to contribute to the identification and the photometric follow-up of the optical counterparts associated with multi-messenger phenomena such as gravitational waves, neutrino emission, gamma-ray bursts and supernovae. We propose to install this equipment at the "Observatoire du Maido" on Reunion Island, which offers excellent astro-climatic conditions and where INSU facilities are already operated.
The deep magnitudes reached by Vera Rubin-LSST, combined with the very large field covered every night, may allow it to play an important role in the search for electromagnetic counterparts following the detection of gravitational waves emitted by the coalescence of a binary system of two neutron stars (or of some neutron star-black hole systems): the kilonova associated with the thermal emission of the dynamical ejecta, and the afterglow due to the non-thermal emission during the deceleration of the relativistic jet. This search represents a complicated challenge: these sources are faint and initially poorly localized. It is necessary to start by scanning the large error box provided by the gravitational-wave detectors to identify the best candidates. A spectro-photometric follow-up by other instruments is then necessary to identify the true counterpart of the coalescence. Even more challenging, the size of the field of view makes it possible to consider the search for "orphan" kilonovae or afterglows, i.e. events without any associated gravitational-wave alert.
In this talk I will describe the scientific motivation of this search, the properties of the probed population, and the shape of the expected emission; I will then discuss how to best prepare this search, and in particular how to develop efficient criteria for the selection of the best candidates with Rubin-LSST.
The Legacy Survey of Space and Time (LSST), using the Vera Rubin Observatory LSST Camera at the Simonyi Survey Telescope, aims to survey the southern sky deeper and faster than any wide-field survey to date. Starting in 2024, and for its 10 years of operations, LSST will enable the discovery of an unprecedentedly large number of astrophysical transients, opening a new era of optical big data in astronomy. Among several challenges, the alert rate forecast for LSST will be at least one to two orders of magnitude larger than that of any survey to date, and it will trigger on typically fainter objects, making it impossible for currently available systems to operate efficiently. In addition, the next decade will see detectors more sensitive to gravitational waves and neutrinos, and new instruments will allow a thorough search of a large part of the sky and a large part of the electromagnetic spectrum, paving the way for multi-messenger astronomy at scale. Fink is an LSST community broker specifically developed to address all these challenges. Designed for fast and efficient analysis of big data, Fink encompasses historical developments and adds state-of-the-art machine learning techniques to generate classification scores for a variety of time-domain phenomena, from Solar System science to galactic and extragalactic science. I will briefly summarize the status of the project, present the current scientific results on the ZTF data stream, and outline future plans towards Rubin.
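As a purely illustrative sketch of what a downstream science filter on such a classified alert stream could look like (generic field names; this is not Fink's actual schema or API):

def select_candidates(alerts, science_class, min_score=0.9, max_mag=21.0):
    # Keep alerts whose classification score for the requested class
    # passes a probability cut and whose latest magnitude is bright
    # enough for follow-up. All field names here are hypothetical.
    for alert in alerts:
        score = alert.get("classifier_scores", {}).get(science_class, 0.0)
        if score >= min_score and alert.get("magpsf", 99.0) <= max_mag:
            yield alert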
The current decade will see the renewal of the ground- and space-based facilities that continuously observe the transient sky at every wavelength. In particular, in 2023, the French-Chinese mission SVOM will pursue the multi-wavelength (and multi-messenger) study of the explosive transient sky initiated 15 years ago by the NASA/Swift mission, with a core program dedicated to Gamma-ray Bursts (GRBs). In the same period, the Vera Rubin observatory will deeply explore the optical transient sky, with millions of alerts to be delivered every night. Among those Vera Rubin transients, a significant number of optical afterglows from cosmological GRBs should be detected every year. Therefore, scientific synergies can be envisioned between the two projects to fully characterize these (still) enigmatic events. The recently selected Fink broker will provide the necessary bridge to connect the SVOM high-energy alerts and the Vera Rubin transient candidates. In this talk, we present how the SVOM Collaboration and the Vera Rubin observatory could complement each other in studying Gamma-ray Bursts, as well as the first implementations done in Fink to make those synergies come true!
Gamma-ray bursts are the most violent phenomena in the universe and are characterized by a bright, ultra-energetic flash of gamma rays lasting from a few seconds to several days. From constraints on the energetics, the emission observed at Earth has to come from a highly relativistic beam with an opening angle of just a few degrees and a bulk Lorentz factor that can reach a thousand.
What happens, however, if the beam is not exactly oriented toward the observer?
From simple geometrical considerations, models predict that the gamma-ray emission becomes too faint to detect, but that emission from the afterglow at lower energies (optical to radio) might peak several days to months later and might be observable.
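The geometrical argument can be summarized by the standard beaming condition (textbook afterglow phenomenology, not a result specific to this talk): the off-axis emission becomes visible once the jet has decelerated enough that its relativistic beaming cone includes the line of sight,
\[ \frac{1}{\Gamma(t)} \gtrsim \theta_{\mathrm{obs}} - \theta_{\mathrm{jet}}, \]
which is why larger viewing angles translate into later and fainter afterglow peaks.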
So far, no such phenomenon has been clearly identified, but Rubin LSST data are expected to contain tens of such events each year: these are called orphan afterglows, because they would be gamma-ray bursts with no gamma-ray emission seen.
Rubin LSST will however identify hundreds of thousands of transients each night, so multiple communities are setting up efficient transient alert brokers to keep up with that pace. The FINK broker developed at IN2P3 will hence be the primary tool to scan, filter and analyze the Rubin LSST alert stream to try to identify orphan afterglows.
I'll briefly present initial work done with a student to look for orphan afterglows in the ZTF alert stream made available through the FINK science portal, and present my plans to move forward to get ready for Rubin LSST data.
In contrast with many surveys that imaged their footprint only once, the LSST will repeatedly image its survey area for a decade. This aspect is key for Solar System science, owing to the ever-changing coordinates and photometry of the objects imposed by celestial mechanics. The LSST is expected to revolutionize the field by significantly increasing the total known population of Solar System Objects (SSOs). Its multi-filter, multi-epoch photometry and astrometry will allow dynamical, physical, and compositional characterization of SSOs for a sample several orders of magnitude larger than the current census. I will present an overview of these different aspects.
We present here the impact of redshift uncertainties on galaxy cluster detection using the Wavelet Z Photometric (WaZP) cluster finder on the DC2 simulation. This is evaluated considering three redshift cases with different levels of complexity: true redshifts, redshifts with Gaussian noise, and photometric redshifts from BPZ. By comparing the clusters detected in each case with the dark matter halos of the simulation, we characterize the cluster selection function, miscentering, and redshift and mass proxy estimation.
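For the intermediate case, a common convention (the abstract does not state the exact width used here) is to scatter each true redshift with a width growing as \( (1+z) \),
\[ z_{\mathrm{obs}} = z_{\mathrm{true}} + \mathcal{N}\!\left(0,\; \sigma_0 \left(1+z_{\mathrm{true}}\right)\right), \]
with \( \sigma_0 \) of order a few percent.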
Weak lensing magnification is an important tool for revealing the masses of galaxy clusters. Galaxies behind a cluster are magnified by the presence of the massive gravitational lens, and their apparent positions on the sky are deflected away from the lens centre. From such effects we can infer the mass of the lens; however, such an analysis may also be sensitive to the presence of intra-cluster dust, which acts to dim the background galaxies. Fortunately, we can differentiate between these two phenomena via the chromatic effects of dust extinction and the different redshift dependence of lensing. For future Rubin magnification measurements it is crucial that the effects of intra-cluster dust are understood in order to fully exploit the weak lensing magnification information. Furthermore, such measurements may provide interesting results on the composition of the intra-cluster medium and the cosmic dust density. We investigate the impact of dust using HSC weak lensing data and the redMaPPer SDSS galaxy cluster catalogue.
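Schematically (standard lensing and extinction behaviour, not the talk's specific model), the two effects enter the observed magnitude of a background galaxy as
\[ \delta m(\lambda) = -2.5 \log_{10}\mu + A(\lambda), \]
where the magnification \( \mu \) is independent of wavelength while the extinction \( A(\lambda) \) follows a dust extinction law; this is what makes the two separable with multi-band data.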
Cosmological analyses with galaxy cluster abundance will be significantly improved with Rubin, moving from samples of thousands of clusters to potentially hundreds of thousands. A standard choice for cosmological cluster analyses is to use Poissonian likelihoods; however, such a likelihood neglects the effects of sample variance, whereby the anticipated number of clusters at a given mass and redshift is a random realisation of some theoretical underlying number. To date this assumption has been justified, but to make the most of the Rubin data, improvements to the cluster likelihood must be considered. The simplest way to deal with the effects of sample variance is to use a Gaussian approximation; however, with such an approximation we lose valuable cosmological information. In this talk we present a new unbinned cluster likelihood which incorporates the effects of sample variance. We then present a framework, using 1000 cosmological simulations of a Rubin-volume universe, in which we can consistently determine the precision of each of the proposed cluster likelihoods: the unbinned Poissonian, the binned Gaussian, and a Poissonian/Gaussian mixture likelihood.
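For context, the textbook binned forms being compared (written here generically; the talk's unbinned construction goes beyond them) are the Poissonian log-likelihood for counts \( N_i \) with predictions \( \lambda_i \),
\[ \ln\mathcal{L}_{\mathrm{Poisson}} = \sum_i \left( N_i \ln\lambda_i - \lambda_i - \ln N_i! \right), \]
and the Gaussian approximation, which can absorb sample variance through an extra covariance \( S_{ij} \) added to the shot noise,
\[ \ln\mathcal{L}_{\mathrm{Gauss}} = -\tfrac{1}{2} \left( N - \lambda \right)^{T} \left[ \mathrm{diag}(\lambda) + S \right]^{-1} \left( N - \lambda \right) - \tfrac{1}{2} \ln\det\left[ \mathrm{diag}(\lambda) + S \right] + \mathrm{const}. \]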
As a new addition to Rubin/LSST France, I will use this talk as an excuse to show the community some of what I have been working on over the last couple of years as a DM member, namely the handling of bright stars in the Rubin Science Pipelines. I will also talk about some of the work being carried out in the DESC working group on Point Spread Functions, for which I recently took over as co-convener.
Weak gravitational lensing is one of the most promising tools of cosmology to constrain models and probe the evolution of dark-matter structures. Yet current analysis techniques are only able to exploit the 2-pt statistics of the lensing signal, ignoring a large fraction of the cosmological information contained in its non-Gaussian part. Exactly how much information is lost, and how it could be exploited, is an open question.
In this work, we propose to measure the information gain from using higher-order (i.e. non-Gaussian) statistics in the analysis of weak gravitational lensing maps. To achieve this goal, we implement fast and accurate lensing N-body simulations based on the TensorFlow framework for automatic differentiation. By implementing gravitational lensing ray-tracing in this framework, we are able to simulate lensing lightcones that mimic surveys like the Euclid space mission or the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST). Because these simulations are based on differentiable physics, we can take derivatives of the resulting gravitational lensing maps with respect to cosmological parameters, or to any systematics included in the simulations. Using these derivatives, we can measure the Fisher information content of various lensing summary statistics on cosmological parameters, and thus help maximize the scientific return of upcoming surveys.
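As a minimal numpy sketch of the final step (assuming Gaussian-distributed summaries with a parameter-independent covariance; in the work above the Jacobian would come from automatic differentiation of the simulations, and the names here are illustrative):

import numpy as np

def fisher_matrix(jacobian, cov):
    # F_ab = (d mu / d theta_a)^T C^{-1} (d mu / d theta_b),
    # the Fisher information of Gaussian-distributed summary
    # statistics whose covariance C does not depend on the parameters.
    # jacobian: (n_stats, n_params) derivatives of the mean summaries.
    # cov: (n_stats, n_stats) covariance of the summaries.
    cinv = np.linalg.inv(cov)
    return jacobian.T @ cinv @ jacobian

Forecast 1-sigma marginalized errors then follow as np.sqrt(np.diag(np.linalg.inv(F))).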
I will describe the recent cosmological analysis of weak lensing and clustering data from the first 3 years of Dark Energy Survey observations and show the powerful resulting constraints. I will then give my perspective on lessons learnt from this analysis and on challenges for the LSST cosmological analysis of these probes, focusing especially on the issue of parameter estimation. This step is computationally intensive and can therefore be a roadblock for the exploration of cosmological models by LSST. I will present results from work done in DES comparing different samplers, and my work to optimize this exploration.
We focus in this presentation on the impact of the parameters of a Deep Drilling (DD) strategy (number of fields, cadence of observation, filter allocation, season length, budget) on the size and depth of the SNe Ia sample observed by the survey. We present a method to collect samples of SNe Ia complete up to higher redshifts and propose a set of realistic DD scenarios.
The detailed nature of type Ia supernovae (SNe~Ia) remains uncertain, and as survey statistics increase, the question of astrophysical systematic uncertainties arises, notably that of the evolution of SN~Ia populations. We study the dependence on redshift of the SN~Ia \texttt{SALT2.4} light-curve stretch, which is a purely intrinsic SN property, to probe its potential redshift drift. The SN stretch has been shown to be strongly correlated with the SN environment, notably with stellar age tracers. We modeled the underlying stretch distribution as a function of redshift, using the evolution of the fraction of young and old SNe~Ia as predicted from the SNfactory dataset, and assuming for each age population a constant underlying stretch distribution made of Gaussian mixtures. We tested our prediction against published samples that were cut to have marginal magnitude selection effects, so that any observed change is indeed astrophysical and not observational in origin. In this first study, there are indications that the underlying SN~Ia stretch distribution evolves as a function of redshift, and that the age-drifting model is a better description of the data, at a significance higher than 5~$\sigma$, than any time-constant model, including the sample-based asymmetric distributions that are often used to correct Malmquist bias. The favored underlying stretch model is bimodal, composed of a high-stretch mode shared by both young and old environments, and a low-stretch mode exclusive to old environments. The precise effect of the redshift evolution of the intrinsic properties of a SN~Ia population on cosmology remains to be studied. The astrophysical drift of the SN stretch distribution does, however, affect current Malmquist bias corrections, and thereby the distances derived from SNe that are affected by observational selection effects. We highlight that this bias will increase with surveys covering increasingly larger redshift ranges, which is particularly important for the Large Synoptic Survey Telescope.
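Schematically, the age-drifting model described above can be written as a redshift-dependent mixture,
\[ P(x_1 \mid z) = \delta(z)\, \phi_{\mathrm{Y}}(x_1) + \left[ 1 - \delta(z) \right] \phi_{\mathrm{O}}(x_1), \]
where \( \delta(z) \) is the fraction of young SNe~Ia at redshift \(z\) and \( \phi_{\mathrm{Y}} \), \( \phi_{\mathrm{O}} \) are the time-constant Gaussian-mixture stretch distributions of the young and old populations; in the favored model \( \phi_{\mathrm{Y}} \) contains only the high-stretch mode while \( \phi_{\mathrm{O}} \) also contains the low-stretch one.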
With measured distances to >100,000 SNe Ia, LSST is the future of supernova cosmology. However, two uncertainties beyond LSST's control will likely dominate any cosmological analysis: a nearby sample is needed to anchor the Hubble diagram, and the rate, diversity and intrinsic properties of SNe Ia must be precisely measured to ensure that our understanding of dark energy is unbiased.
The Zwicky Transient Facility (ZTF) is specifically designed to achieve both of these goals. Observing half of the visible sky every two nights, and spectroscopically typing all events to z=0.05, the ZTF dataset is not only 10 times larger than any existing dataset but also unbiased. Upon conclusion, with measured distances to over 5000 SNe Ia, ZTF will be the primary anchor for the LSST Hubble diagram. In the meantime, this dataset is ripe for understanding the intrinsic properties of SNe Ia. With >2000 cosmological SNe Ia already identified, classified and analysed, I will discuss the first results from this survey. Highlights include measurements of the photometric properties of SNe Ia and their diversity, how these properties correlate with their local environment, and how to standardise the luminosity of these events optimally and without bias for use in a cosmological analysis. Understanding these relationships is key to the success of LSST, but only ZTF can measure them.
A large variety of cosmological observations has validated the $\Lambda$CDM model as the leading description of the dynamics of the Universe. This model requires the validity of several assumptions, notably the Cosmological Principle (homogeneity and isotropy at large scales). Despite numerous successes, the standard model faces some tensions, such as the measurement of large-scale velocity flows: some measurements agree with what is predicted by $\Lambda$CDM (Colin et al. 2011, Planck Collaboration XIII 2014, Carrick et al. 2015) while others do not (Kashlinsky et al. 2008; Feldman, Watkins & Hudson 2010; Abate & Feldman 2012; Watkins & Feldman 2015).
Type Ia supernovae (SNe Ia) are cosmological probes able to map the Universe at different scales and measure its dynamics. The new low-z dataset from the Zwicky Transient Facility (ZTF) constitutes a unique sample to investigate potential anisotropies in the nearby Universe. I will present my investigation of bulk flow measurements using ZTF simulations and data.
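The observable underlying such measurements is the standard modulation of SN redshifts by peculiar velocities (to first order in \( v/c \)),
\[ 1 + z_{\mathrm{obs}} = \left( 1 + z_{\mathrm{cos}} \right) \left( 1 + \frac{\mathbf{v} \cdot \hat{\mathbf{n}}}{c} \right), \]
so that a coherent bulk flow \( \mathbf{v} \) imprints a dipolar pattern on the Hubble residuals of a low-z SN sample.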
The nature of dark energy, the mysterious component driving the acceleration of cosmic expansion, is still unknown.
One main approach to constraining its equation of state is to construct a Hubble diagram, the evolution of luminosity distance with redshift, using Type Ia supernovae (SNe Ia) as luminosity distance indicators.
Measuring distances to SNe Ia requires a model of the SN spectrophotometric evolution that takes into account the intrinsic diversity of SNe Ia.
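As a reminder of how the light-curve parameters delivered by such a model enter the Hubble diagram, the commonly used standardization relation reads
\[ \mu = m_B^{*} - M_B + \alpha\, x_1 - \beta\, c , \]
where \( m_B^{*} \), \( x_1 \) and \( c \) are the fitted amplitude, stretch and color of each SN, and \( \alpha \), \( \beta \), \( M_B \) are standardization parameters adjusted on the sample.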
The model currently in use in the community is the Spectral Adaptive Lightcurve Template 2 (SALT2), developed between 2007 (Guy et al. 2007) and 2010 (Guy et al. 2010).
The state of the art is SALT2.4, which was trained for the Joint Light-curve Analysis (Betoule et al. 2014).
Recently a model update called SALT3 has been published (Kenworthy et al. 2021), with a new training set.
I am currently developing a re-implementation of SALT2 with the goal of improving the general methodology and the overall training speed. In particular, we fit the time of maximum along with all the other parameters. The error model is determined along with the model itself. Finally, the calibration uncertainties are also propagated in the same minimization. In this talk, I will give a short status report on the SALT+ development effort, as well as a roadmap for the next few months.