The Intergalactic Medium as a Cosmological Probe

Principal Investigator: Matteo Viel


  • Astrophysical Probes of Fundamental Interactions
  • Structures in the Universe and Emergent Phenomena

Abstract: This line of research aims at exploiting the cosmological role of the intergalactic medium (IGM) in standard and non-standard cosmological models, by using state-of-the-art data sets and hydrodynamical simulations of structure formation. The IGM is a unique probe of linear matter perturbations at intermediate and small scales (0.1-100 comoving Mpc/h) and at redshifts z = 2-6. In recent years, the IGM has also been used: to probe the geometry of our Universe by measuring Baryonic Acoustic Oscillations (BAO) in the 3D transmitted flux; to set constraints on the nature of dark matter and the total neutrino mass; to find 4-sigma evidence for a dipole-like variation of α (the fine-structure constant) across the sky at the level of 10 ppm; and to provide constraints on the D/H ratio and on the amount of baryons in the Universe. All these findings have deep and rich implications for structure formation, theoretical particle physics, and experimental dark matter searches, topics that are all supported by the IFPU. Furthermore, there is a large degree of complementarity between the IGM and other cosmological probes which are sensitive to different scales and redshifts, such as the Cosmic Microwave Background (CMB), weak gravitational lensing, and cluster number counts. From the data point of view, the investigation of the IGM has developed mainly along three different lines: the collection of a large number (hundreds of thousands) of low signal-to-noise, low-resolution quasar spectra; the collection of a few hundred high-resolution, high signal-to-noise quasar spectra; and the collection of intermediate sets of medium-resolution spectra. The first approach has been followed by SDSS-III (BOSS) in order to measure the BAO peak and the neutrino mass, with a sample in which errors are dominated by systematic effects rather than statistical ones. 
The second avenue has been pursued by different groups in order to constrain the nature of dark matter on small scales or to measure variations of fundamental constants using heavy-element absorption lines; the third approach has mainly been used to extend observations to the high-redshift domain, to reduce astrophysical and instrumental nuisances, and to provide an overall consistency check between the other approaches. In this respect, the high-resolution spectrograph ESPRESSO, which is currently taking data and in which our team is involved, is expected to: 1) soon improve the accuracy of the measurement of two fundamental constants (the fine-structure constant and the proton-to-electron mass ratio) compared to VLT/UVES or Keck/HIRES, helping to clarify the current controversy; 2) reach a new regime in probing the small-scale structure of metal and Lyman-alpha forest lines. Both these scientific efforts will be instrumental in paving the way to the E-ELT, the next milestone, in which the Italian community is deeply involved. Concerning cosmological simulations, the strategy in the last few years has been twofold: on the one hand, to gradually increase the resolution of simulations of the cosmic web in order to achieve numerical convergence for the desired quantities (most importantly the flux power spectrum); on the other hand, to improve the physical modelling of the IGM by incorporating effects such as galactic feedback (Active Galactic Nuclei and winds) and radiative transfer. Additionally, simulations have been instrumental in producing mock data sets with the desired properties, resembling the observed data sets as closely as possible, including observational and instrumental procedures such as continuum fitting, accurate modelling of the spectrograph resolution, etc. Methodologies have also developed significantly in the last few years. 
In particular, exploring the cosmological, astrophysical, and nuisance parameter space has proven difficult and time consuming even with standard Bayesian Markov Chain Monte Carlo (MCMC) analyses. Emulators of the Lyman-alpha forest flux power spectrum and/or interpolation techniques, which aim at predicting the desired quantity in the multi-dimensional parameter space using the results of hydrodynamical simulations, have been developed and tested. Different cosmological models, in which either the nature of dark matter is modified or the dark energy is in the form of quintessence, have also been compared to the observational data.
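To illustrate the emulator/interpolation idea, the sketch below fits a radial-basis-function interpolator to a toy grid of "simulation" outputs and then predicts the summary statistic at a parameter point that was never simulated. All names and numbers (the `toy_flux_power` function, the parameter ranges, the kernel width) are hypothetical stand-ins for an actual hydrodynamical-simulation grid, not the pipeline used in this project.

```python
# Minimal sketch of a flux-power-spectrum emulator: an RBF interpolator is
# trained on a (hypothetical) grid of simulation outputs and then queried
# at arbitrary points of the multi-dimensional parameter space.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training grid: two parameters (e.g. an amplitude and a
# thermal parameter), 40 "simulated" models.
theta = rng.uniform([0.7, 0.8], [0.9, 1.2], size=(40, 2))

def toy_flux_power(p):
    # Stand-in for an expensive hydro simulation: a smooth scalar summary
    # (e.g. the flux power in one k bin) as a function of the parameters.
    return 0.3 * p[:, 0] ** 2 + 0.1 * np.sin(3.0 * p[:, 1])

y = toy_flux_power(theta)

def rbf_matrix(a, b, ell=0.1):
    # Gaussian radial basis functions with a fixed length scale ell.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

# Solve for the RBF weights on the training grid (small ridge term for
# numerical stability of the kernel matrix).
K = rbf_matrix(theta, theta) + 1e-8 * np.eye(len(theta))
w = np.linalg.solve(K, y)

# Emulate the summary statistic at a new, unsimulated parameter point.
p_new = np.array([[0.8, 1.0]])
pred = rbf_matrix(p_new, theta) @ w
print(pred[0])
```

In a real analysis the interpolated quantity would be the full flux power spectrum P_F(k, z) over many bins, and the emulator would be called at every step of the MCMC in place of a new simulation.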

Status of project and perspectives: The project has reached important results in terms of constraints on particle DM models, constraints on the variation of fundamental constants, new models of ionizing sources in the high-redshift Universe, and the characterization of physical IGM properties from high to low redshift. Important efforts are also focusing on exploiting present instrumentation like ESPRESSO and on characterizing future science cases (QUBRICS, ANDES@E-ELT).

The plan for the next 3 years is the following:

  1. Theory and Simulations: We will define a set of cosmological parameters in the standard LCDM scenario and in alternative cosmological models: warm DM (WDM), mixed cold and warm DM (CWDM), and models with an arbitrary suppression of power at small scales, able to capture different small-scale departures (Murgia et al., 2017; Iršič et al., 2017). We will specify the resolution and box sizes needed to address the scientific goals of NewIGM, using where possible simulations already developed and available to us (Sherwood simulation suite, Bolton et al., 2017; Relics simulation suite, Puchwein et al., 2019); otherwise, the simulations will be run on dedicated supercomputing machines like the Ulysses supercomputer at SISSA. We will also define, both in the context of LCDM and of beyond-LCDM scenarios, the astrophysical parameters: the thermal evolution of the IGM, the reionization redshift, the astrophysical modelling of radiative transfer (UV/temperature fluctuations), etc.
  2. Finding the Cosmic Beacons (the QUBRICS survey): We will construct a sample of the brightest possible QSOs with z >~ 2.5 for high-resolution spectroscopy. In particular, the number of suitable targets in the Southern hemisphere is much smaller than in the North. To fill this gap we started the QUBRICS survey (Calderone et al., 2019; Boutsia et al., 2020; Guarneri et al., 2021). Using a simple machine learning approach we have identified hundreds of new QSOs (with z > 2.5 and i < 18), among which are two extremely bright (i = 16.2-16.4) objects at z_em = 3.9. We plan to further exploit the available optical and IR surveys in the South using advanced techniques, e.g. the probabilistic random forest, which properly handles both missing data in the bands of the input surveys and uncertainties in the photometric measurements.
  3. Ionization – Stellar Sources: We will search for analogs of the z > 6.5 ionizers near the peak of the cosmic star formation history of the Universe (z > 2, e.g., Du et al., 2019 and references therein), in particular those showing bursty star formation events compatible with large specific star formation rates (> 50-100 Gyr^-1) and the presence of hot and massive stars. We want (1) to access the faintest star-forming galaxies ever observed at z > 2; and (2) to identify the internal constituents of high-z (z > 2) galaxies down to star-forming complexes (100 pc) and single star clusters (< 30 pc). Moreover, we will reassess the AGN contribution to reionization in the high-z Universe.
  4. Physical Properties: We will improve current constraints on the physical state of the IGM, in particular its thermal state across the redshift range z = 2-6 probed by the Lyman-alpha forest.
  5. Interplay and connection with Galaxies: We will study the properties of metal absorption lines and derive constraints on the interaction and connection between the IGM and galaxies in the redshift range z ~ 2-5, using intermediate- to high-resolution, high-SNR QSO spectra. The ultimate goal will be to constrain the physical properties of the gas related to galaxies and the evolutionary models of galaxies.
  6. Fundamental physics: We plan to quantitatively analyze the simulations and compare them with the latest available data in order to: A) address fundamental physics questions; B) test the constancy of fundamental constants, in particular the fine-structure constant, α, and the proton-to-electron mass ratio, μ (claims of variability have already been made, but the signal, at the level of a few parts per million, is comparable to the size of possible instrumental systematics, 2-10 ppm); C) provide new measurements of the primordial D abundance.
  7. Innovative instruments and tools: We plan to: A) enforce a proper treatment of the currently available data (both observational and simulated), with a focus on the repeatability of the procedures; B) use the products of this data treatment to assess the requirements of future instrumentation (in terms of efficiency, resolution, wavelength coverage, stability, etc.). Case studies include ANDES@E-ELT and the Sandage-Loeb test.
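For item 1, the "arbitrary suppression of power at small scales" is typically expressed, in the approach of Murgia et al. (2017), as a fitting formula for the linear transfer function relating the non-cold DM power spectrum to the CDM one; a commonly quoted form (conventions should be checked against the paper itself) is:

```latex
% Linear power suppression of a generic non-cold DM model relative to CDM,
% parametrized by (alpha, beta, gamma):
T^2(k) \equiv \frac{P_{\rm nCDM}(k)}{P_{\rm CDM}(k)},
\qquad
T(k) = \left[\, 1 + (\alpha k)^{\beta} \,\right]^{\gamma}.
% Here alpha sets the suppression scale while (beta, gamma) control its
% shape; the half-mode scale k_{1/2} is defined by T^2(k_{1/2}) = 1/2.
```

With three parameters this form can mimic WDM, CWDM, and other small-scale departures within a single framework, which is what allows a common simulation grid to cover them all.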
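For the fine-structure-constant test of item 6, the standard observable is the differential shift of heavy-element transitions with different sensitivities to α (the many-multiplet method). Schematically, and hedging on exact conventions:

```latex
% Many-multiplet relation: the rest-frame wavenumber omega_z of a transition
% in an absorber at redshift z depends on the local value of alpha through
% the transition's sensitivity coefficient q:
\omega_z = \omega_0 + q\,x,
\qquad
x \equiv \left(\frac{\alpha_z}{\alpha_0}\right)^{2} - 1
  \simeq 2\,\frac{\Delta\alpha}{\alpha}.
% Comparing measured shifts among lines with different q in the same system
% yields Delta-alpha/alpha, once wavelength-calibration systematics are
% controlled.
```

This is why ppm-level claims hinge on the wavelength-calibration stability that ESPRESSO was designed to deliver.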
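The Sandage-Loeb test mentioned under item 7 measures the slow drift of absorption-line redshifts in real time, zdot = (1+z)H0 - H(z). A back-of-the-envelope estimate in flat LCDM is sketched below; the parameter values are illustrative, not the project's adopted cosmology.

```python
# Back-of-the-envelope Sandage-Loeb (redshift drift) signal in flat LCDM:
# zdot = (1+z) H0 - H(z), observed as a tiny velocity drift dv = c dz/(1+z).
import math

H0_KM_S_MPC = 70.0   # Hubble constant, km/s/Mpc (illustrative value)
OMEGA_M = 0.3        # matter density parameter (illustrative value)
MPC_KM = 3.0857e19   # kilometres per megaparsec
C_M_S = 2.998e8      # speed of light, m/s
YR_S = 3.156e7       # seconds per year

def hubble(z):
    """H(z) in s^-1 for flat LCDM."""
    h0 = H0_KM_S_MPC / MPC_KM
    return h0 * math.sqrt(OMEGA_M * (1 + z) ** 3 + (1 - OMEGA_M))

def velocity_drift(z, delta_t_yr):
    """Spectroscopic velocity drift in m/s accumulated over delta_t_yr years."""
    h0 = H0_KM_S_MPC / MPC_KM
    zdot = (1 + z) * h0 - hubble(z)     # redshift drift per second
    dz = zdot * delta_t_yr * YR_S
    return C_M_S * dz / (1 + z)

# At z = 4 the drift is negative (the Universe was decelerating at that
# epoch) and of order 10 cm/s over two decades: the cm/s-precision regime
# targeted by ANDES at the E-ELT.
dv = velocity_drift(4.0, 20.0)
print(f"{dv * 100:.1f} cm/s")
```

The tiny size of the signal is exactly why this case study drives the efficiency and stability requirements listed under item 7.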