Gaia Data Release 2: processing of the photometric data
Published version
Peer-reviewed
Authors
Abstract
CONTEXT. The second Gaia data release is based on 22 months of mission data with an average of 0.9 billion individual CCD observations per day. A data volume of this size and granularity requires a robust and reliable but still flexible system to achieve the demanding accuracy and precision constraints that Gaia is capable of delivering.

AIMS. We aim to describe the input data, the treatment of blue photometer/red photometer (BP/RP) low-resolution spectra required to produce the integrated G_BP and G_RP fluxes, the process used to establish the internal Gaia photometric system, and finally, the generation of the mean source photometry from the calibrated epoch data for Gaia DR2.

METHODS. The internal Gaia photometric system was initialised using an iterative process that is solely based on Gaia data. A set of calibrations was derived for the entire Gaia DR2 baseline and then used to produce the final mean source photometry. The photometric catalogue contains 2.5 billion sources comprising three different grades depending on the availability of colour information and the procedure used to calibrate them: 1.5 billion gold, 144 million silver, and 0.9 billion bronze. These figures reflect the results of the photometric processing; the content of the data release will be different due to the validation and data quality filters applied during the catalogue preparation. The photometric processing pipeline, PhotPipe, implements all the processing and calibration workflows in terms of Map/Reduce jobs based on the Hadoop platform. This is the first example of a processing system for a large astrophysical survey project to make use of these technologies.

RESULTS. The improvements in the generation of the integrated G-band fluxes, in the attitude modelling, in the cross-matching, and in the identification of spurious detections led to a much cleaner input stream for the photometric processing. This, combined with the improvements in the definition of the internal photometric system and calibration flow, produced high-quality photometry. Hadoop proved to be an excellent platform choice for the implementation of PhotPipe in terms of overall performance, scalability, downtime, and manpower required for operations and maintenance.
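The abstract states that the internal photometric system was initialised through an iterative process using only Gaia data, and that the resulting calibrations were then applied to produce the mean source photometry. The sketch below is a minimal, hypothetical illustration of such an alternating self-calibration loop in Python; the data layout, the field names (source_id, calib_unit_id, instrumental_mag), and the simple unweighted averaging are assumptions made for illustration only and do not reproduce the PhotPipe algorithms or data model.

```python
import numpy as np

def iterate_photometric_system(epochs, n_sources, n_units, n_iter=20):
    """Toy illustration (not the PhotPipe implementation) of establishing an
    internal photometric system by iteration: alternate between
    (a) estimating a per-calibration-unit zeropoint from the current mean
        source magnitudes, and
    (b) re-deriving the mean source magnitudes from the calibrated epochs,
    using only the survey's own data.

    `epochs` is assumed to be a NumPy structured array with integer fields
    'source_id' and 'calib_unit_id' and a float field 'instrumental_mag'.
    """
    src = epochs["source_id"]
    unit = epochs["calib_unit_id"]
    mag = epochs["instrumental_mag"]

    mean_mag = np.zeros(n_sources)   # current mean source photometry
    zeropoint = np.zeros(n_units)    # current per-unit calibration offsets

    for _ in range(n_iter):
        # (a) calibration step: each unit's zeropoint is the mean residual
        #     of its epochs against the current source means
        resid = mag - mean_mag[src]
        zeropoint = (np.bincount(unit, weights=resid, minlength=n_units)
                     / np.maximum(np.bincount(unit, minlength=n_units), 1))

        # (b) source-update step: mean source photometry from the epochs
        #     after removing the current per-unit zeropoints
        calibrated = mag - zeropoint[unit]
        mean_mag = (np.bincount(src, weights=calibrated, minlength=n_sources)
                    / np.maximum(np.bincount(src, minlength=n_sources), 1))

    return mean_mag, zeropoint
```

In this toy form, each step is a grouped average (here via np.bincount), which is the kind of per-key aggregation that maps naturally onto the Map/Reduce jobs the abstract describes; a real system must also fix the overall magnitude scale, which this sketch leaves degenerate.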
Journal Title
Astronomy & Astrophysics
Journal ISSN
1432-0746
Sponsorship
Science and Technology Facilities Council (ST/K000756/1)
Science and Technology Facilities Council (ST/L006553/1)
Science and Technology Facilities Council (ST/S000089/1)