Computational methods, software development, and High Performance Computing awareness are of ever-growing importance in Astrophysics and Cosmology. In this context, an additional challenge comes from the impossibility of reproducing experiments in the controlled environment of a laboratory, making simulations unavoidable for testing theoretical models. In this work I present a heterogeneous ensemble of projects we have carried out in the context of simulations of the large-scale structure of the Universe. The common thread is the development and use of original computational tools for the analysis and post-processing of simulated data. In the first part of this manuscript I report on the efforts to develop a consistent theory for the size function of cosmic voids detected in biased tracers of the density field. Upcoming large-scale surveys will map the distribution of galaxies with unprecedented detail and to depths never reached before. Thanks to these large datasets, the void size function is expected to become a powerful statistic for inferring the geometrical properties of space-time. In spite of this, the existing theoretical models cannot correctly describe the distribution of the voids detected, in either unbiased or biased simulated tracers. We have improved the void selection procedure by developing an algorithm that redefines the void ridges and, consequently, their radii. By applying this algorithm, we validate the volume-conserving model of the void size function on a set of unbiased simulated density field tracers. We highlight the difference in internal structure between voids selected in this way and those identified by the popular VIDE void finder. We also extend the validation of the model to the case of biased tracers, finding that a relation exists between the tracer used to sample the underlying dark matter density field and its unbiased counterpart.
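As a reminder of the quantities involved, the volume-conserving (Vdn) model referred to above is commonly written as follows (this is the standard schematic form from the literature, e.g. Jennings et al., not necessarily the exact notation adopted in the thesis):

```latex
% Vdn (volume-conserving) void size function: the void number density at
% radius R follows from requiring that the volume fraction in voids is
% conserved during the expansion from the linear (Lagrangian) radius R_L to R.
\frac{\mathrm{d}n}{\mathrm{d}\ln R} =
  \frac{f_{\ln\sigma}(\sigma)}{V(R)}\,
  \left.\frac{\mathrm{d}\ln\sigma^{-1}}{\mathrm{d}\ln R_L}\right|_{R_L(R)}
```

Here $f_{\ln\sigma}(\sigma)$ is the excursion-set multiplicity function for voids (Sheth & van de Weygaert), $\sigma$ is the rms fluctuation of the linear density field on scale $R_L$, and $V(R)$ is the void volume.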
Moreover, we demonstrate that, as long as this relation is accounted for, the size function is a viable approach for studying cosmology with voids. Finally, by parameterising the size function in terms of the linear effective bias of tracers, we take an additional step towards analysing cosmic voids in real surveys. The proposed size function model has been accurately calibrated on halo catalogues and used to assess its ability to provide forecasts on the cosmological constraints, namely on the matter density parameter, $\Omega_M$, and on the normalisation of the linear matter power spectrum, $\sigma_8$.

The second part of the manuscript focuses on the hybrid C++/Python implementation of ScamPy, our empirical framework for ``painting'' galaxies on top of the Dark Matter halo/sub-halo hierarchy obtained from N-body simulations. Our confidence in the reliability of N-body Dark Matter-only simulations rests on the argument that the evolution of the non-collisional matter component depends only on the effect of gravity and on the initial conditions. The formation and evolution of the luminous component (i.e. galaxies and intergalactic baryonic matter) are far from being understood at the same level as the dark matter. Among the possible approaches for modelling the luminous component, empirical methods are designed to reproduce observable properties of a target (observed) population of objects at a given moment of their evolution. With respect to ab initio approaches (i.e. hydrodynamical N-body simulations and semi-analytical models), empirical methods are typically cheaper in terms of computational power and are by design more reliable in the high-redshift regime. Building an empirical model of galaxy occupation requires defining the hosted-object/hosting-halo connection that associates a baryonic counterpart to the underlying DM distribution.
The method we use is based on the sub-halo clustering and abundance matching (SCAM) scheme, which requires observations of the 1- and 2-point statistics of the target population we want to reproduce. This method is particularly tailored for high-redshift studies and therefore relies on the observed high-redshift galaxy luminosity functions and correlation properties. The core functionalities of ScamPy are written in C++ and exploit Object Oriented Programming, with wide use of polymorphism, to achieve flexibility and high computational efficiency. In order to provide an easily accessible interface, all the libraries are wrapped in Python and come with extensive documentation. I present the theoretical background of the method and provide a detailed description of the implemented algorithms. We have validated the key components of the framework, demonstrating that it produces scientifically meaningful results with satisfactory performance. Finally, we have tested the framework in a proof-of-concept application at high redshift. Namely, we paint a mock galaxy population on top of a high-resolution dark matter-only simulation, mimicking the luminosity and clustering properties of high-redshift Lyman Break Galaxies retrieved from recent literature. We use these mock galaxies to infer the spatial and statistical distribution of ionizing radiation during the Epoch of Reionization.
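The abundance-matching half of the SCAM scheme can be sketched as follows. This is an illustrative toy implementation, not the actual ScamPy API (the function name and signature are hypothetical): sub-haloes are rank-ordered by mass and assigned luminosities so that the cumulative number density of haloes above a given mass matches the cumulative luminosity function above the assigned luminosity.

```python
import numpy as np

# Hypothetical sketch of (sub-halo) abundance matching, not the ScamPy API.

def abundance_match(halo_masses, lum_grid, n_cum, volume):
    """Assign a luminosity to each (sub-)halo by matching number densities.

    halo_masses : (N,) masses of the haloes/sub-haloes in the simulated box
    lum_grid    : (M,) ascending luminosity grid
    n_cum       : (M,) cumulative luminosity function n(>L), decreasing in L
    volume      : comoving volume of the simulation box
    """
    # rank of each halo when sorted by decreasing mass (0 = most massive)
    ranks = np.argsort(np.argsort(-halo_masses))
    # cumulative number density of haloes more massive than each object
    n_halo = (ranks + 1.0) / volume
    # invert n(>L): np.interp needs ascending abscissae, so flip both arrays
    return np.interp(n_halo, n_cum[::-1], lum_grid[::-1])
```

With any monotonically decreasing $n(>L)$, the most massive halo receives the brightest luminosity, reproducing the monotonic halo mass-luminosity relation that abundance matching assumes; ScamPy additionally constrains the occupation with the 2-point statistics, which this toy sketch omits.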
From cosmic voids to collapsed structures: HPC methods for Astrophysics and Cosmology / Ronconi, Tommaso. - (2020 Oct 06).
Appears in collections: 8.1 PhD thesis