Workshop: Pushing frontiers in astrometry and photometry (1)
- Réza Ansari
Workshop: LSST software stack beyond LSST
- Pierre Astier
Workshop: Pushing frontiers in astrometry and photometry (2)
- Robert Lupton (Princeton University)
Workshop: Large photometric surveys (1)
- Robert Lupton (Princeton University)
Workshop: Pushing frontiers in astrometry and photometry (3)
- Mario Juric
Workshop: Computing infrastructure and data management
- Dominique Boutigny (LAPP)
Workshop: Large photometric surveys (2)
- Réza Ansari
Fabio Hernandez (CC-IN2P3), Prof. Réza Ansari (LAL)
Prof. Lance Miller (Oxford University)
Accurate knowledge of the Point Spread Function (PSF) is crucial to meet the science goals of the wide area surveys of LSST and other telescopes. Increasing statistical precision in surveys leads to a requirement for increased accuracy in our knowledge of the PSF, which varies across images, with time and with the spectrum of the objects being observed – it is a challenging but interesting...
Mr Peter Melchior (Princeton University)
LSST and similarly deep surveys have to deal with ~40% of all objects being blended with others. The classical deblending approaches are often insufficient for high-quality measurements of galaxy properties. Their failures are related to simplifying assumptions about the galaxy shape or massive parameter degeneracies for complex galaxy models, and the reliance on single-band imaging. I will...
Jim Bosch (Princeton University)
LSST Data Management has committed to processing crowded stellar fields through our image differencing pipelines, but is not mandated to perform direct image photometry in these regions. Exactly what this means in terms of algorithms and the scientific quality of our catalogs is subtle and in some ways uncertain, and in this talk I'll explain the algorithms we're planning to run, attempt to...
Mr Stephen Portillo (Harvard-Smithsonian Center for Astrophysics)
Cataloging is challenging in crowded fields because sources are extremely covariant with their neighbors and blending makes even the number of sources ambiguous. We present the first optical probabilistic catalog, cataloging a crowded (~0.1 sources per pixel brighter than 22nd magnitude in F606W) Sloan Digital Sky Survey r band image from M2. Probabilistic cataloging returns an ensemble of...
Dr Masayuki Tanaka (NAOJ)
HSC-SSP is a 3-layered imaging survey aimed at addressing major astrophysical questions such as the nature of dark matter and dark energy. The HSC data have been processed with a version of the LSST stack and a number of scientific papers have been published. I will give an overview of the survey and its first public data release.
Robert Lupton (Princeton University)
The HSC is an 800 MPixel imager on the 8.2m Subaru telescope, carrying out a 1400 deg^2 weak lensing survey. In many ways this survey is an LSST precursor, and we are reducing the data using prototype LSST pipelines. I shall discuss the state of the pipelines that we are using for the next data release, paying special attention to algorithms that we know will need to be refined before we...
Photometry extraction and validation using the HSC pipeline of HSC+u-band data in HSC-SSP deep fields
Dr Jean Coupon (University of Geneva)
We obtained 300 hours at CFHT to conduct a u-band follow-up (CLAUDS program, u<27) of the HSC-SSP Deep fields (grizY, r<27) over 20 deg^2. The u band is primarily used to find drop-out (high-redshift) galaxies and compute photometric redshifts. It is therefore necessary to combine the two datasets in a fully consistent way. To do this we developed tools to import the CFHT stack images into the...
Dominique Boutigny (LAPP)
Philippe Gris (LPC Clermont-Ferrand)
The talk will summarize the attempts made to reprocess the CFHT Deep fields with the stack. The results obtained will be presented, along with future work.
Mr Pierre Astier (LPNHE)
In order to optimally use multiple images of the same sky area, one has to accurately map the coordinates of these images onto a common coordinate system. Commonly, the images we get are significantly deeper than reference catalogs, and hence the coordinate mappings benefit from making use of common objects in the images, even if they do not appear in reference catalogs. I will describe some...
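The kind of coordinate mapping described above can be illustrated with a minimal least-squares fit of an affine transform between matched object positions (a sketch using numpy; real astrometric solvers fit higher-order polynomial distortions and use robust matching, which is omitted here):

```python
import numpy as np

def fit_affine(xy_from, xy_to):
    """Fit an affine transform mapping xy_from -> xy_to by least squares.

    xy_from, xy_to: (N, 2) arrays of matched object positions.
    Returns a (3, 2) matrix A such that [x, y, 1] @ A approximates xy_to.
    """
    n = len(xy_from)
    design = np.hstack([xy_from, np.ones((n, 1))])  # columns: x, y, 1
    A, *_ = np.linalg.lstsq(design, xy_to, rcond=None)
    return A

def apply_affine(A, xy):
    """Apply a fitted affine transform to (N, 2) positions."""
    n = len(xy)
    return np.hstack([xy, np.ones((n, 1))]) @ A

# Example: recover a known small rotation + shift from noiseless matches.
rng = np.random.default_rng(0)
src = rng.uniform(0, 2048, size=(50, 2))       # detector positions (pixels)
theta = 0.001                                   # small rotation (radians)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
dst = src @ R.T + np.array([3.5, -1.2])         # rotated and shifted positions
A = fit_affine(src, dst)
residuals = apply_affine(A, src) - dst
```

Because the test transform is exactly affine, the fit recovers it to floating-point precision; with real catalogs the residuals instead measure the unmodeled distortion.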
Ian Sullivan (University of Washington)
The wavelength-dependent refractive index of the atmosphere leads to spectrum-dependent distortions of sources away from zenith. While the effect of bulk refraction is easily corrected during astrometric calibration, this Differential Chromatic Refraction (DCR) introduces distortions that depend on the intrinsic spectrum of a source and vary with the parallactic angle and airmass of the...
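The size of the DCR effect can be estimated with the standard plane-parallel approximation R ≈ (n − 1) tan z together with a simple two-term dispersion law for air. This is a sketch only: the Cauchy-style coefficients below are rough sea-level values chosen for illustration, not the atmospheric model any pipeline actually uses.

```python
import math

def refractivity(wavelength_nm):
    """Approximate air refractivity (n - 1) via a two-term Cauchy-style law.

    Coefficients are rough sea-level values, for illustration only.
    """
    lam_um = wavelength_nm / 1000.0
    return 1e-6 * (272.6 + 1.663 / lam_um**2)

def dcr_offset_arcsec(lam_blue_nm, lam_red_nm, zenith_deg):
    """Differential refraction between two wavelengths, in arcseconds,
    using R ~ (n - 1) * tan(z) (plane-parallel atmosphere)."""
    z = math.radians(zenith_deg)
    dn = refractivity(lam_blue_nm) - refractivity(lam_red_nm)
    return dn * math.tan(z) * 206265.0  # radians -> arcseconds

# Blue light refracts more than red, so a blue source is displaced toward
# zenith relative to a red one, and the offset grows with zenith angle:
# at z = 45 deg the 400 nm vs 700 nm offset is of order an arcsecond.
```

Even this crude estimate shows why DCR matters for LSST: an arcsecond-scale, spectrum-dependent shift is far larger than the survey's astrometric error budget.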
Advances in astronomical image processing - Solving the problems of image coaddition and image subtraction
Mr Barak Zackay (Weizmann Institute of Science)
While co-addition and subtraction of astronomical images stand at the heart of observational astronomy, the existing solutions for them lack rigorous argumentation, do not achieve maximal sensitivity, and are often slow. Moreover, there is no widespread agreement on how they should be done, and often different methods are used for different scientific applications. I am going to present...
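For context on the co-addition problem, the classical baseline that such talks compare against is the per-pixel inverse-variance weighted mean of aligned, photometrically scaled exposures. A minimal sketch (this is the textbook estimator, not the optimal method advocated in the talk):

```python
import numpy as np

def inverse_variance_coadd(images, variances):
    """Classical per-pixel coadd: weighted mean with weights 1/variance.

    images, variances: lists of equal-shape 2-D arrays, assumed already
    aligned to a common coordinate system and photometrically scaled.
    Returns the coadded image and its per-pixel variance.
    """
    weights = [1.0 / v for v in variances]
    wsum = np.sum(weights, axis=0)
    coadd = np.sum([w * im for w, im in zip(weights, images)], axis=0) / wsum
    return coadd, 1.0 / wsum

# Two noisy 'exposures' of the same constant scene: the coadd variance
# 1/(1/4 + 1/16) = 3.2 is lower than that of either input (4 and 16).
rng = np.random.default_rng(1)
truth = np.full((8, 8), 100.0)
ims = [truth + rng.normal(0.0, s, truth.shape) for s in (2.0, 4.0)]
vars_ = [np.full(truth.shape, s**2) for s in (2.0, 4.0)]
coadd, var = inverse_variance_coadd(ims, vars_)
```

This estimator ignores the per-exposure PSF, which is precisely the limitation that motivates the more rigorous co-addition schemes discussed in the talk.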
Mr Dominique Fouchez (CPPM)
As an introduction, I will present the basics of the implementation of image differencing with the LSST DM stack. Then I will present some preliminary results obtained by running this software on a subset of the CFHTLS Deep Fields images and some comparisons using the published supernova data of the SNLS collaboration.
Dr Frédéric Arenou (CNRS/GEPI, Observatoire de Paris)
The Gaia mission, currently in its operational phase, will be described, from its current status and the organisation of the data processing to the publication of the catalogues.
Mr Peter Melchior (Princeton University)
I will review the status of the Dark Energy survey, give an outlook on the upcoming results from its weak-lensing program, and describe several of the algorithmic advancements that enable high-precision measurements from ground-based imaging.
Eli Rykoff (SLAC National Accelerator Laboratory)
Many scientific goals for DES require calibration of broadband grizY photometry that is stable in time and uniform over the sky to better than 1%. It is also necessary to limit systematic uncertainty in the calibrated broadband magnitudes due to the spectrum of the source. I present details on the Forward Global Calibration Method (FGCM), which combines data taken with auxiliary...
Nicolas Regnault (LPNHE)
For about a decade, photometric calibration has been identified as a potential limitation of current and future imaging surveys. Uncertainties on the relative (band to band) flux scale are currently the dominant contribution to the systematic error budget affecting SNIa distances. Non-uniformities of the imager response may introduce spurious correlations in the LSS analyses, via the...
Dr Michael Mommert (Northern Arizona University)
PHOTOMETRYPIPELINE (PP) is an automated pipeline that produces calibrated photometry from imaging data through image registration, aperture photometry, photometric calibration, and target identification with only minimal human interaction. PP utilizes the widely used Source Extractor software for source identification and aperture photometry; SCAMP is used for image registration. Both image...
Dr William O'Mullane (AURA/LSST)
In this brief talk we will outline the current Data Management design and approach to handling LSST Data. DM is a multifaceted distributed system. It is also built in a distributed manner so we shall also cover the organization of the DM development work.
Prof. Mario Juric (University of Washington)
The LSST is an integrated survey system. The observatory, telescope, camera and the data management systems will be built to conduct the LSST survey and will not support the 'PI mode' in the classical sense. Instead, the ultimate, science-enabling, deliverable of LSST will be the fully reduced data products and accompanying services. In this talk, we will present the baseline design and...
Fritz Mueller (SLAC)
To satisfy the need to efficiently store, query, and analyze catalogs running into trillions of rows and petabytes of data, we are developing Qserv, a distributed shared-nothing SQL database query system. We describe Qserv's design, architecture, and ability to scale to LSST's data requirements. We illustrate its potential with some current test results.
Fabrice Jammes (CNRS)
French engineers and researchers are very interested in the challenge of the LSST catalog and contribute in particular to validating Qserv and preparing its production phase. In this context, they perform integration tests, based on CFHT data, ranging from the pixels to the execution of scientific SQL queries. In addition, they design Qserv's large-scale deployment and management procedure...
Dr David Rousseau (LAL-Orsay)
The LHC experiments are now routinely collecting (also simulating) and analyzing petabytes of data each year, publishing 200 papers, each signed by 3000 people. Focussing on the ATLAS experiment, the talk will describe the organisation of the data processing that makes it possible: what balance between accuracy (of software and calibrations) and stability was reached, allowing specialized...
Mr Christian Arnault (CNRS)
Spark is a very promising technology offering distributed data and computing mechanisms. At LAL (Orsay) we have started to look at how the typical computing workflows used in LSST could use the Spark eco-system: how to distribute algorithms in a map-reduce approach, and how to format various data structures to partition them in a distributed file system. To this end, an OpenStack-based cluster has been...
Dr Marc Sauvage (CEA/DRF/Irfu/SAp)
The preparation of the Euclid mission is progressing, with the instruments entering the production of the Qualification models, and the data processing ground segment in its design review. I will make a brief update of where the mission stands today, and focus on the plans that are made for operation and data processing. Details will also be given on how Euclid relies on external...
Robert Gruendl (University of Illinois)
The Dark Energy Survey covering 5,000 square degrees of the South Galactic Cap along with a dedicated 30 square degree SN survey has now completed four years of observations. The Data Management group (DESDM) has been responsible for the processing, release, and curation of these data products for the DES collaboration and to make them available to the general astronomical community. I will...
Thomas Erben (Universität Bonn, Argelander-Institut für Astronomie)
I will discuss the data processing and data handling of the CFHTLenS and KiDS collaborations. Special emphasis will be given to our algorithms and processing methods for weak gravitational lensing studies. I will summarise the current status of our image processing pipeline THELI, which is optimised to produce lensing-quality data from current and future optical Wide-Field Imaging Surveys. Our...
Dr Christopher Waters (Pan-STARRS)
This year marks the ten year anniversary of first light of the Pan-STARRS1 (PS1) telescope and 1.4 gigapixel GPC1 camera. In the subsequent decade, the Pan-STARRS Image Processing Pipeline (IPP) has continually evolved, culminating in our first data release of approximately three billion sources and one million image stacks at the end of 2016. In this presentation, I will talk about some of...
Santiago Serrano (Universidad Autónoma de Barcelona)