
dc.contributor.author: Euclid Collaboration
dc.identifier.citation: Euclid Collaboration. (2022). Euclid preparation : XIII. Forecasts for galaxy morphology with the Euclid Survey using deep generative models. <i>Astronomy and Astrophysics</i>, <i>657</i>, Article A90.
dc.description.abstract: We present a machine learning framework to simulate realistic galaxies for the Euclid Survey, producing more complex and realistic galaxies than the analytical simulations currently used in Euclid. The proposed method combines the control of galaxy shape parameters offered by analytic models with realistic surface brightness distributions learned from real Hubble Space Telescope observations by deep generative models. We simulate a galaxy field of 0.4 deg² as it will be seen by the Euclid visible imager VIS, and we show that galaxy structural parameters are recovered to an accuracy similar to that for pure analytic Sérsic profiles. Based on these simulations, we estimate that the Euclid Wide Survey (EWS) will be able to resolve the internal morphological structure of galaxies down to a surface brightness of 22.5 mag arcsec⁻², and the Euclid Deep Survey (EDS) down to 24.9 mag arcsec⁻². This corresponds to approximately 250 million galaxies at the end of the mission and a 50% complete sample for stellar masses above 10^10.6 M⊙ (resp. 10^9.6 M⊙) at a redshift z ∼ 0.5 for the EWS (resp. EDS). The approach presented in this work can contribute to improving the preparation of future high-precision cosmological imaging surveys by allowing simulations to incorporate more realistic galaxies. [en]
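The abstract benchmarks parameter recovery against pure analytic Sérsic profiles. As an illustration only (this is the standard Sérsic surface brightness law, not the paper's deep generative model; parameter values are arbitrary), a minimal Python sketch:

```python
import numpy as np

def sersic_profile(r, I_e, r_e, n):
    """Sérsic surface brightness I(r) = I_e * exp(-b_n * ((r/r_e)**(1/n) - 1)).

    r   : radius (same units as r_e)
    I_e : intensity at the effective radius r_e
    n   : Sérsic index (n=1 exponential disk, n=4 de Vaucouleurs spheroid)

    b_n uses the common analytic approximation b_n ≈ 2n - 1/3 + 4/(405 n),
    which ensures r_e encloses roughly half the total light.
    """
    b_n = 2.0 * n - 1.0 / 3.0 + 4.0 / (405.0 * n)
    return I_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

# Example: a de Vaucouleurs-like profile sampled out to 5 effective radii.
r = np.linspace(0.1, 5.0, 50)
profile = sersic_profile(r, I_e=1.0, r_e=1.0, n=4.0)
```

By construction the profile equals I_e at r = r_e (the exponent vanishes there) and decreases monotonically outward.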
dc.publisher: EDP Sciences
dc.relation.ispartofseries: Astronomy and Astrophysics
dc.rights: CC BY 4.0
dc.subject.other: techniques: image processing
dc.subject.other: galaxies: structure
dc.subject.other: galaxies: evolution
dc.subject.other: cosmology: observations
dc.title: Euclid preparation : XIII. Forecasts for galaxy morphology with the Euclid Survey using deep generative models
dc.contributor.laitos: Fysiikan laitos (Department of Physics) [fi]
dc.contributor.laitos: Department of Physics [en]
dc.rights.copyright: © Euclid Collaboration 2022
jyx.fundinginformation: We thank the IAC, where the first author was on a long-term visit during the production of this paper, with special thanks to the TRACES team for their support. We would also like to thank the Direction Informatique de l'Observatoire (DIO) of the Paris Meudon Observatory for the management and support of the GPU we used to train our deep learning models. We also thank the Centre National d'Etudes Spatiales (CNES) and the Centre National de la Recherche Scientifique (CNRS) for the financial support of the PhD in which this study took place. This work has made use of CosmoHub. CosmoHub has been developed by the Port d'Informació Científica (PIC), maintained through a collaboration of the Institut de Física d'Altes Energies (IFAE) and the Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT) and the Institute of Space Sciences (CSIC and IEEC), and was partially funded by the "Plan Estatal de Investigación Científica y Técnica y de Innovación" program of the Spanish government. The Euclid Consortium acknowledges the European Space Agency and a number of agencies and institutes that have supported the development of Euclid, in particular the Academy of Finland, the Agenzia Spaziale Italiana, the Belgian Science Policy, the Canadian Euclid Consortium, the Centre National d'Etudes Spatiales, the Deutsches Zentrum für Luft- und Raumfahrt, the Danish Space Research Institute, the Fundação para a Ciência e a Tecnologia, the Ministerio de Economia y Competitividad, the National Aeronautics and Space Administration, the Nederlandse Onderzoekschool Voor Astronomie, the Norwegian Space Agency, the Romanian Space Agency, the State Secretariat for Education, Research and Innovation (SERI) at the Swiss Space Office (SSO), and the United Kingdom Space Agency. A complete and detailed list is available on the Euclid web site.
