We present a machine learning framework to simulate realistic galaxies for the Euclid Survey, producing more complex and realistic galaxies than the analytic simulations currently used in Euclid. The proposed method combines the control over galaxy shape parameters offered by analytic models with realistic surface brightness distributions learned from real Hubble Space Telescope observations by deep generative models. We simulate a galaxy field of 0.4 deg² as it will be seen by the Euclid visible imager VIS, and we show that galaxy structural parameters are recovered with an accuracy similar to that for pure analytic Sérsic profiles. Based on these simulations, we estimate that the Euclid Wide Survey (EWS) will be able to resolve the internal morphological structure of galaxies down to a surface brightness of 22.5 mag arcsec⁻², and the Euclid Deep Survey (EDS) down to 24.9 mag arcsec⁻². This corresponds to approximately 250 million galaxies at the end of the mission and a 50% complete sample for stellar masses above 10^10.6 M⊙ (10^9.6 M⊙, respectively) at a redshift z ∼ 0.5 for the EWS (EDS, respectively). The approach presented in this work can contribute to improving the preparation of future high-precision cosmological imaging surveys by allowing simulations to incorporate more realistic galaxies.
Euclid preparation: XIII. Forecasts for galaxy morphology with the Euclid Survey using deep generative models / Bretonniere, H.; Huertas-Company, M.; Boucaud, A.; Lanusse, F.; Jullo, E.; Merlin, E.; Tuccillo, D.; Castellano, M.; Brinchmann, J.; Conselice, C. J.; Dole, H.; Cabanac, R.; Courtois, H. M.; Castander, F. J.; Duc, P. A.; Fosalba, P.; Guinet, D.; Kruk, S.; Kuchner, U.; Serrano, S.; Soubrie, E.; Tramacere, A.; Wang, L.; Amara, A.; Auricchio, N.; Bender, R.; Bodendorf, C.; Bonino, D.; Branchini, E.; Brau-Nogue, S.; Brescia, M.; Capobianco, V.; Carbone, C.; Carretero, J.; Cavuoti, S.; Cimatti, A.; Cledassou, R.; Congedo, G.; Conversi, L.; Copin, Y.; Corcione, L.; Costille, A.; Cropper, M.; Da Silva, A.; Degaudenzi, H.; Douspis, M.; Dubath, F.; Duncan, C. A. J.; Dupac, X.; Dusini, S.; Farrens, S.; Ferriol, S.; Frailis, M.; Franceschi, E.; Fumana, M.; Garilli, B.; Gillard, W.; Gillis, B.; Giocoli, C.; Grazian, A.; Grupp, F.; Haugan, S. V. H.; Holmes, W.; Hormuth, F.; Hudelot, P.; Jahnke, K.; Kermiche, S.; Kiessling, A.; Kilbinger, M.; Kitching, T.; Kohley, R.; Kummel, M.; Kunz, M.; Kurki-Suonio, H.; Ligori, S.; Lilje, P. B.; Lloro, I.; Maiorano, E.; Mansutti, O.; Marggraf, O.; Markovic, K.; Marulli, F.; Massey, R.; Maurogordato, S.; Melchior, M.; Meneghetti, M.; Meylan, G.; Moresco, M.; Morin, B.; Moscardini, L.; Munari, E.; Nakajima, R.; Niemi, S. M.; Padilla, C.; Paltani, S.; Pasian, F.; Pedersen, K.; Pettorino, V.; Pires, S.; Poncet, M.; Popa, L.; Pozzetti, L.; Raison, F.; Rebolo, R.; Rhodes, J.; Roncarelli, M.; Rossetti, E.; Saglia, R.; Schneider, P.; Secroun, A.; Seidel, G.; Sirignano, C.; Sirri, G.; Stanco, L.; Starck, J. -L.; Tallada-Crespi, P.; Taylor, A. N.; Tereno, I.; Toledo-Moreo, R.; Torradeflot, F.; Valentijn, E. 
A.; Valenziano, L.; Wang, Y.; Welikala, N.; Weller, J.; Zamorani, G.; Zoubian, J.; Baldi, M.; Bardelli, S.; Camera, S.; Farinelli, R.; Medinaceli, E.; Mei, S.; Polenta, G.; Romelli, E.; Tenti, M.; Vassallo, T.; Zacchei, A.; Zucca, E.; Baccigalupi, C.; Balaguera-Antolinez, A.; Biviano, A.; Borgani, S.; Bozzo, E.; Burigana, C.; Cappi, A.; Carvalho, C. S.; Casas, S.; Castignani, G.; Colodro-Conde, C.; Coupon, J.; De La Torre, S.; Fabricius, M.; Farina, M.; Ferreira, P. G.; Flose-Reimberg, P.; Fotopoulou, S.; Galeotta, S.; Ganga, K.; Garcia-Bellido, J.; Gaztanaga, E.; Gozaliasl, G.; Hook, I. M.; Joachimi, B.; Kansal, V.; Kashlinsky, A.; Keihanen, E.; Kirkpatrick, C. C.; Lindholm, V.; Mainetti, G.; Maino, D.; Maoli, R.; Martinelli, M.; Martinet, N.; Mccracken, H. J.; Metcalf, R. B.; Morgante, G.; Morisset, N.; Nightingale, J.; Nucita, A.; Patrizii, L.; Potter, D.; Renzi, A.; Riccio, G.; Sanchez, A. G.; Sapone, D.; Schirmer, M.; Schultheis, M.; Scottez, V.; Sefusatti, E.; Teyssier, R.; Tutusaus, I.; Valiviita, J.; Viel, M.; Whittaker, L.; Knapen, J. H. In: Astronomy & Astrophysics, ISSN 0004-6361, 657 (2022). DOI: 10.1051/0004-6361/202141393
Euclid preparation: XIII. Forecasts for galaxy morphology with the Euclid Survey using deep generative models
Carbone C.;Kunz M.;Mansutti O.;Meneghetti M.;Moresco M.;Moscardini L.;Pettorino V.;Stanco L.;Weller J.;Baldi M.;Camera S.;Romelli E.;Baccigalupi C.;Burigana C.;Castignani G.;Maino D.;Martinelli M.;Renzi A.;Teyssier R.;Viel M.;
2022-01-01
File: aa41393-21.pdf (open access)
Type: Published version (PDF)
License: Creative Commons
Size: 3.16 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.