Specialization & Master Thesis Projects

Proposed projects

Available as 2022-2023 semester projects.

 
Proposed by: Abhijit Bendre (Postdoc) & Jennifer Schober (Group leader)
 
Type of project: Master/Specialization [can be adapted for TPIVb]
 
Project flavor: Theory/Numerical simulations
 
The interstellar medium (ISM) is a complex multi-phase environment whose dynamics are governed by
relativistic cosmic rays, thermal and kinetic feedback from supernova (SN) explosions, and magnetic fields. The pivotal role of the magnetic field in star formation, galactic outflows, and ISM dynamics is widely appreciated but not well understood. On length scales of ~10 to ~100 pc, magnetic fields are expected to have a significant impact on the dynamics of the ISM, in particular on the formation of the dense cold clouds that lead to star formation. Moreover, the interpretation of many crucial radio observables (synchrotron emission and Faraday rotation) depends sensitively on the distribution of magnetic fields across the various ISM phases. This project aims to address this issue by statistically analyzing the multi-phase structure of the ISM, its dynamical evolution, and its interaction with dynamically significant magnetic fields, mainly by computing Betti numbers from magnetohydrodynamic (MHD) simulations of the ISM. By doing so, we seek to gain valuable insights into the role of magnetic fields in shaping the ISM and its physical processes.
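For flavour, here is a minimal sketch of the kind of topological statistic involved: counting the connected components (the zeroth Betti number) of over-dense regions in a density cube. A full analysis would use a persistent-homology library such as GUDHI; the random field below is only a stand-in for an MHD snapshot.

```python
import numpy as np
from scipy import ndimage

# Illustrative only: a smoothed random "density cube" stands in for an MHD snapshot.
rng = np.random.default_rng(42)
density = ndimage.gaussian_filter(rng.normal(size=(64, 64, 64)), sigma=2)

def betti0(field, threshold):
    """Count connected components (zeroth Betti number) of the
    excursion set {field > threshold}."""
    mask = field > threshold
    _, n_components = ndimage.label(mask)
    return n_components

# Sweeping the threshold traces how the topology of dense structures changes.
for t in np.linspace(density.min(), density.max(), 5):
    print(f"threshold {t:+.2f}: b0 = {betti0(density, t)}")
```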
 
URL/References:
Local simulations of the interstellar medium:
https://ui.adsabs.harvard.edu/abs/2015AN....336..991B/abstract

Supervisors: LASTRO/CVLab/eSpace (Prof. Jean-Paul Kneib/Stephan Hellmich/Andrew Price)

Type of Project: Master project (can be adapted for TP-IV)

Duration: 14 weeks (Official start/end date: September 19-December 22)

Submission of final report: January 15

Final Presentation: TBD

Prerequisites:

  • Open to coding in Python
  • Interest in image rendering
  • Familiarity with rendering (Blender) and camera basics (i.e. the pinhole camera model) is a plus.
  • Interest in computer vision or machine learning is a plus.

Context

As part of a collaborative research project between CVLab and LASTRO, we are exploring novel techniques for determining the rotational and physical properties of space debris. For this purpose, we are currently developing methods for the detection and extraction of space debris observations from large astronomical data archives. These archives contain observational data spanning a 10-year period and include a large number of serendipitous satellite and space debris observations. On the astronomical images, these objects appear as characteristic streaks, most of which cross the entire detector during the several minutes of exposure time.

In order to monitor the performance of our streak detection methods, we incorporate synthetic streaks into the data before processing. This allows us to determine the detection efficiency and to verify orbit determination routines. Currently, these synthetic streaks are randomly generated features that do not reflect any information about orbit, size or shape. Improving the generation of synthetic streaks by incorporating this information would make them appear more realistic and would improve our algorithm’s robustness in the analysis of real streaks.

The goal of our analysis is to obtain as much information as possible from the observed space objects. An important property that we want to determine is the rotation rate and axis. This tumbling state can theoretically be obtained from the intensity profile of the observed streak. However, for certain objects or tumbling states, we might not have enough data for a robust analysis. A priori knowledge of the size and shape of the object can be used to generate synthetic data that constrain the tumbling state of the observed objects.

Project Scope

The goal of this project is to develop a tool that allows the insertion of realistic synthetic observations of space objects into astronomical images. These synthetic observations should be based on an artificial population of space objects that resembles the real population as closely as possible. This population is then used to implant synthetic streaks into real data. While the observatory location, telescope and instrument determine which objects are visible at the time of observation, the object shape, observing geometry (illumination conditions), rotation, atmospheric extinction and seeing define the precise appearance of the synthetic streak. The final outcome of this project will be the synthetic population of space objects and a tool that inserts the synthetic streaks into real data.
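As a flavour of the implantation step, here is a minimal sketch (all parameters hypothetical): rasterize the line traced by the object during the exposure and convolve it with a Gaussian point-spread function to mimic seeing.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def add_synthetic_streak(image, start, end, total_flux, seeing_sigma=2.0):
    """Implant a straight, uniform-brightness streak into `image`.

    start, end   : (row, col) endpoints of the streak during the exposure
    total_flux   : integrated counts deposited along the streak
    seeing_sigma : Gaussian PSF width in pixels (mimics atmospheric seeing)
    """
    streak = np.zeros_like(image, dtype=float)
    n_steps = int(np.hypot(end[0] - start[0], end[1] - start[1])) * 4
    rows = np.linspace(start[0], end[0], n_steps).round().astype(int)
    cols = np.linspace(start[1], end[1], n_steps).round().astype(int)
    np.add.at(streak, (rows, cols), total_flux / n_steps)
    return image + gaussian_filter(streak, seeing_sigma)

# Hypothetical usage on a noise-only frame:
frame = np.random.normal(100.0, 5.0, size=(256, 256))  # sky background + noise
frame = add_synthetic_streak(frame, (20, 10), (230, 240), total_flux=5e4)
```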

Tasks

  • Familiarization with astronomical data archives
    (data products, instruments, sensors, environment that influences the appearance of space objects on astronomical images)
  • Implementation of a rendering engine (Blender)
  • Generation of a synthetic population of space objects, incorporating orbits, shapes, sizes and rotations
  • Development of a tool to implant synthetic streaks on astronomical images

Supervisors: M. Hirschmann (Faculty)

Type of Project: Master project (can be adapted for TP-IV)

Project description: Are new JWST observations of massive galaxies at z>10 challenging our LambdaCDM cosmological model?

Thanks to the new, revolutionary James Webb Space Telescope, galaxies have been discovered to form earlier in cosmic history, and are likely more massive, than previously thought and predicted by state-of-the-art cosmological simulations assuming a LambdaCDM cosmogony. A number of potential solutions to resolve this tension have been discussed in the literature, such as less efficient stellar feedback/UV radiation background, higher star formation efficiency, the existence of massive (PopIII) stars producing more UV photons, no significant dust attenuation, etc., but hardly any quantitative study has been conducted so far to test these scenarios. In this context, the student would take advantage of a modern galaxy formation model applied to merger trees from a large dark-matter-only simulation to explore the impact of different stellar feedback and star formation models on number counts of UV-bright galaxies at z > 10 and the related UV luminosity function, confronted with new JWST data. These model developments will be used to create novel mock galaxy catalogues for high-redshift galaxy populations, providing an interpretative framework for current and future high-redshift galaxy surveys such as those with JWST.
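For illustration, a minimal sketch of the kind of summary statistic involved: a binned UV luminosity function computed from a mock catalogue (the mock magnitudes and survey volume below are placeholders).

```python
import numpy as np

def uv_luminosity_function(m_uv, volume_mpc3, bins):
    """Binned UV luminosity function: comoving number density per
    absolute-magnitude interval, phi(M) [Mpc^-3 mag^-1]."""
    counts, edges = np.histogram(m_uv, bins=bins)
    dm = np.diff(edges)
    phi = counts / (volume_mpc3 * dm)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, phi

# Hypothetical mock catalogue of absolute UV magnitudes at z > 10:
m_uv = np.random.normal(-19.0, 1.0, size=2000)
centers, phi = uv_luminosity_function(m_uv, volume_mpc3=1e6,
                                      bins=np.arange(-23, -16, 0.5))
```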

Supervisors: M. Hirschmann (Faculty), R. Tress (PostDoc)

Type of Project: Master project (can be adapted for TP-IV)

Project description: Understanding the interstellar medium of barred galaxies via next-generation, idealised simulations

The PHANGS survey [1] is observing nearby galaxies at unprecedentedly high resolution with different revolutionary telescopes, such as ALMA and JWST, to study their complex interstellar medium in great detail. Many of those galaxies are barred disc galaxies. These barred structures are responsible for efficiently driving copious gas towards the centre, inducing complex gas motions, extreme star formation, and potentially fuelling a central active galactic nucleus. To robustly understand and interpret these observations, at GalSpec we conduct high-resolution, idealised magnetohydrodynamic simulations of such galaxies. In this context, the student would work on the generation of a model of the background stellar and dark matter potential in which the gas evolves, stars form and explode, etc. (by developing their own Python routines). This can be done by constructing an analytic fixed background potential in a similar way as in [2]. The IR photometric data of these observed galaxies (a proxy for the old stellar component) can be fitted with a disc component, a bulge component, and an exponential bar component. Parameters can then be adjusted with simple isothermal simulations of the gas in such a background potential by comparing to the observed galaxy morphology. These results will be used to conduct a variety of novel, idealised MHD simulations of “PHANGS”-like galaxies to provide a novel, interpretative framework for the observed PHANGS galaxies.
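As a flavour of the analytic-potential approach, here is a minimal sketch of a Miyamoto-Nagai disc component (Miyamoto & Nagai 1975), one common choice for a fixed stellar-disc potential; the actual functional forms and parameters used in the project would follow the fitting described above.

```python
import numpy as np

G = 4.301e-6  # gravitational constant in kpc (km/s)^2 / Msun

def miyamoto_nagai_potential(R, z, M=5e10, a=3.0, b=0.3):
    """Miyamoto-Nagai disc potential Phi(R, z).

    R, z : cylindrical coordinates [kpc]
    M    : disc mass [Msun];  a, b : scale length / height [kpc]
    (parameter values here are illustrative placeholders)
    """
    return -G * M / np.sqrt(R**2 + (a + np.sqrt(z**2 + b**2))**2)

def circular_velocity(R, M=5e10, a=3.0, b=0.3):
    """Midplane circular velocity v_c = sqrt(R dPhi/dR) at z = 0."""
    denom = (R**2 + (a + b)**2) ** 1.5
    return np.sqrt(G * M * R**2 / denom)
```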

References:

[1] https://sites.google.com/view/phangs/home?pli=1

[2] https://ui.adsabs.harvard.edu/abs/2020MNRAS.499.4455T/abstract

Supervisors: M.  Hirschmann (Faculty), M. Farcy (PostDoc)

Type of Project: Master project (can be adapted for TP-IV)

Project description: Which physical process(es) can suppress star formation in galaxies at cosmic dawn?

Recent observations have uncovered an increasingly large population of massive, quiescent galaxies as early as z>4, now spectroscopically confirmed by first data from the James Webb Space Telescope. The origin of the suppression of star formation less than 2 billion years after the Big Bang remains an unsolved puzzle, and is debated to be related to specific physical processes such as merger events and starbursts, supernova explosions, AGN feedback, etc. Interestingly, modern state-of-the-art cosmological simulations largely fail to reproduce the observed number densities of quiescent, massive galaxies. This failure may be related to the uncertain sub-grid models adopted, but also to the definition of quiescent galaxies in observations. The proposed project will be based on cosmological simulations of high-redshift galaxies, and aims to explore how different tracers and definitions of quiescence change the comparison of simulations to novel JWST observations of massive quiescent galaxies at z>4. Results will be used for the development of improved feedback models and the running of new cosmological simulations. The student will learn to work with modern cosmological simulations, and gain experience in using and developing Python packages.
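By way of illustration, one common definition of quiescence is a cut in specific star formation rate (sSFR); a minimal sketch follows (the tracers and thresholds to be explored are precisely the subject of the project).

```python
import numpy as np

def quiescent_fraction(stellar_mass, sfr, ssfr_cut=1e-10):
    """Fraction of galaxies below an sSFR threshold [yr^-1].

    A fixed cut (e.g. sSFR < 1e-10 yr^-1) is only one of several
    definitions in the literature; redshift-dependent cuts are common.
    """
    ssfr = sfr / stellar_mass
    return np.mean(ssfr < ssfr_cut)

# Hypothetical simulated sample:
mass = 10 ** np.random.uniform(9.5, 11.5, 5000)   # Msun
sfr = 10 ** np.random.normal(0.5, 0.8, 5000)      # Msun/yr
print(f"quiescent fraction: {quiescent_fraction(mass, sfr):.2%}")
```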

Supervisors: M.  Hirschmann (Faculty), Adele Plat (PostDoc)

Type of Project: Master project (can be adapted for TP-IV)

Project description: Origin of extreme emission-line galaxies at cosmic dawn

New observations with the James Webb Space Telescope have unveiled a population of very high-redshift galaxies with extremely elevated emission-line ratios (e.g. in [OIII]/Hb or [OIII]/[OII], originating from ionised gas in galaxies) compared to those of present-day galaxies. The physical origin of these elevated line ratios is highly debated and could be linked to different extreme conditions of the interstellar medium and various ionisation conditions in galaxies at the earliest cosmic epochs. With observations alone, however, it can be very difficult to robustly address this puzzle, i.e. to disentangle the influence of different ISM and radiation properties on line emission. Thus, this project aims at *theoretically* exploring which ISM and ionisation properties are able to produce extreme emission-line galaxies at the earliest cosmic epochs, consistent with the new JWST observations. For that, novel emission-line catalogues of simulated galaxies (based on different cosmological simulations) will be employed and compared. Results will be important for the further development and improvement of emission-line modelling of simulated galaxies. The student will learn to work with modern cosmological simulations and photo-ionisation models, and gain experience in using and developing Python packages.

Proposed by: David Harvey (Faculty), Yves Revaz (Faculty)

Simulating the formation of galaxies is vital to understanding how the Universe formed and the underlying physics behind it. By simulating galaxies we can test theories and probe the nature of the elusive dark matter. However, simulating galaxies can take a long time, in particular owing to the treatment of cooling due to molecules and metals. Indeed, an accurate treatment requires solving many non-linear, inter-dependent equations. If we can speed up the estimation of metal cooling in galaxies, we will be able to dramatically speed up our simulations. In a completely new and unique way, this master's project will aim to use deep learning to quickly and precisely estimate the abundances and cooling of the set of atomic species at play during the formation of galaxies. By constructing a model that bypasses the need to solve these complicated equations, we can hopefully speed up the simulations and hence open up the possibility to probe new models of dark matter. The student will get hands-on experience with simulations, machine learning and deep learning, and gain experience in using Python packages such as TensorFlow and the GRACKLE libraries.
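A minimal sketch of the idea, assuming a training set of cooling rates tabulated beforehand (e.g. with GRACKLE); the architecture, inputs and random data below are placeholders, not a tuned design.

```python
import numpy as np
import tensorflow as tf

# Hypothetical training set: inputs (density, temperature, metallicity),
# target = log cooling rate, e.g. tabulated with GRACKLE beforehand.
X = np.random.rand(10000, 3).astype("float32")
y = np.random.rand(10000, 1).astype("float32")

# Small fully connected emulator: a sketch of the idea, not a tuned model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),  # predicted log cooling rate
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=256, verbose=0)

# Once trained, a fast forward pass replaces the expensive network of
# non-linear cooling equations inside the simulation loop.
rate = model.predict(np.array([[0.5, 0.2, 0.1]], dtype="float32"))
```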

 
Proposed by: Yves Revaz (Faculty), Pascale Jablonka (Faculty)
 
 
What is the mass range of the first stars? What are the conditions for them to become black holes or explode at the end of their evolution? What are the chemical signatures expected in either case? Can they be observed and, if yes, where should we look? These are some of the questions that this project will address.
To meet this challenge, the work will be based on numerical simulations that reproduce the properties of low-mass galaxies. Because they have had very short star formation histories, these are the systems with the highest probability of retaining the signatures of the early supernova explosions. Different models of first stars will be considered, and mixing phenomena in the galaxy's interstellar medium will be traced and quantified. The results will be used to plan new observations of strategic chemical elements.

 
 
 
Proposed by: Michaela Hirschmann (Faculty), Frédéric Courbin (Faculty)  
 
Type of project: Master project or semester project

Most galaxies contain a supermassive black hole (SMBH) of between a few million and a few hundred million times the mass of the Sun. These SMBHs accrete mass at the very center of their host galaxy, a process that emits a tremendous amount of light. Such an object is called a quasar, whose luminosity can outshine that of its host galaxy. But how do the properties of quasars relate to those of their host galaxies, in particular their mass properties? Do quasars power galaxies? Or do they form only in the most massive galaxies? Many of the answers to these questions rely on relations between the masses of SMBHs and their host galaxies, and on the evolution of these relations with redshift. We propose to use existing hydrodynamical cosmological simulations to establish such relations, with particular focus on the dark matter content of quasar host galaxies, as this is now a measurable quantity thanks to gravitational lensing.

 
References:
  • Three quasi-stellar objects acting as strong gravitational lenses (Courbin et al. 2012)
  • Concordance between Observations and Simulations in the Evolution of the Mass Relation between Supermassive Black Holes and Their Host Galaxies (Ding et al. 2022)
 
Proposed by: Yves Revaz (Faculty), Pascale Jablonka (Faculty)
 
Type of project: Master project/Semester project
Discovered only about fifteen years ago, ultra-faint dwarf galaxies (UFDs) are the faintest galaxies known in the Universe. They are also the most dark-matter dominated ones. As such, they are fundamental probes of the nature of dark matter. While about fifty UFDs are now identified orbiting the Milky Way, their formation and evolution are still poorly understood. Moreover, recent deep spectroscopic observations have revealed signs of tidal features resulting from their interaction with the Milky Way. Under these conditions, the classical method used to calculate their dark matter content, so far based on the virial theorem, is no longer valid. Proper N-body simulations must be run.
 
In order to address this challenge, models of UFD galaxies will first be extracted from cosmological simulations. Their evolution will then be followed in a realistic Milky Way potential. The impact of the gravitational interaction (disruption, elongation, increase of the velocity dispersion) on both the galaxy's stellar content and its dark halo will be systematically studied and compared with observations.
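For flavour, a minimal sketch of the kind of orbit integration involved, using an NFW halo as a stand-in for a realistic Milky Way potential (all parameter values illustrative).

```python
import numpy as np

G = 4.301e-6  # gravitational constant in kpc (km/s)^2 / Msun

def nfw_acceleration(pos, m_vir=1e12, r_s=20.0, c=10.0):
    """Acceleration from an NFW halo (a simple stand-in for a realistic
    Milky Way potential). pos in kpc; returns (km/s)^2 / kpc."""
    r = np.linalg.norm(pos)
    m0 = m_vir / (np.log(1 + c) - c / (1 + c))
    m_enc = m0 * (np.log(1 + r / r_s) - (r / r_s) / (1 + r / r_s))
    return -G * m_enc * pos / r**3

def leapfrog(pos, vel, dt_myr=1.0, n_steps=10000):
    """Kick-drift-kick leapfrog orbit integration."""
    dt = dt_myr * 1.0227e-3  # Myr -> kpc / (km/s)
    orbit = []
    for _ in range(n_steps):
        vel = vel + 0.5 * dt * nfw_acceleration(pos)
        pos = pos + dt * vel
        vel = vel + 0.5 * dt * nfw_acceleration(pos)
        orbit.append(pos.copy())
    return np.array(orbit)

# Hypothetical UFD on an eccentric orbit: 60 kpc out, 120 km/s tangential.
orbit = leapfrog(np.array([60.0, 0.0, 0.0]), np.array([0.0, 120.0, 0.0]))
```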
 
 
 
 
Proposed by: Eric Paic (PhD), Frédéric Courbin (Faculty) 
 
Type of project: Master project

The variability of strongly lensed quasar light curves is set by three main components: 1) the continuum flux of the source, 2) microlensing by stars in the lens galaxy, and 3) reverberation of the continuum by the Broad Line Region (BLR) (see Fig. 1 of the first reference paper). These light curves therefore carry information about the structure of the quasar and the content of the lens galaxy.

A method using the power spectrum of such light curves was designed to disentangle the different variability components and constrain the physical properties of the lens system.

The aim of this project is to refine this method and apply it to the large COSMOGRAIL sample of microlensing light curves.
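As a flavour of the estimation step, a minimal sketch computing the power spectrum of an irregularly sampled light curve with a Lomb-Scargle periodogram; the published method fits a physical model to such spectra, which this sketch does not attempt.

```python
import numpy as np
from astropy.timeseries import LombScargle

# Hypothetical light curve: irregular sampling with seasonal gaps.
t = np.sort(np.random.uniform(0, 3000, 500))        # days
mag = (18.0 + 0.3 * np.sin(2 * np.pi * t / 400)
       + np.random.normal(0, 0.05, t.size))         # signal + noise

# Lomb-Scargle handles the uneven sampling that a plain FFT cannot.
frequency, power = LombScargle(t, mag).autopower()
```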

 
References:
  • Power spectrum fitting of the microlensing light curve of QJ0158-4325: Paic et al. (2021)
  • Microlensing light curves of lensed quasars sample: COSMOGRAIL XIX
 
Proposed by: Eric Paic (PhD), Aymeric Galan (PhD), Frédéric Courbin (Faculty)
 
Type of project: Master project

Monitoring the luminosity of multiple images of a strongly lensed quasar over several years allows us to study the structure of the quasar and the content of the lensing galaxy. Because of the Earth's revolution and meteorological hazards, the resulting light curves are not regularly sampled and contain large gaps. The aim of this project is to use a method called inpainting to predict the signal where it is missing. This method relies on wavelet transforms to decompose the light curves into multiple frequencies, in order to recover realistic and continuous curves that match the statistical properties of the observed ones. Ultimately, these regularly sampled light curves will give better constraints on the physical model of quasar structures.

Another application of inpainting, in two dimensions, is the prediction of the light distribution of elliptical galaxies when part of that light is hidden. This is typically the case when masking the light from a background source superimposed on the lens galaxy, in order to reconstruct solely the light of the lens galaxy. In this part of the project, the goal is to characterize the inpainting method on real observations of isolated galaxies, by artificially creating a mask that hides different portions of those galaxies. Depending on progress throughout the project, an application to gravitational lens modeling will be envisioned.
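As a flavour of the decomposition step, a minimal sketch using PyWavelets; actual inpainting schemes iterate thresholding of such coefficients while re-imposing the observed samples, which this sketch does not do.

```python
import numpy as np
import pywt

# Hypothetical light curve with a gap (a boolean mask marks missing epochs).
signal = np.sin(np.linspace(0, 20, 512)) + np.random.normal(0, 0.1, 512)
mask = np.ones(512, dtype=bool)
mask[200:260] = False                       # the seasonal gap to inpaint

# Decompose into multiple frequency bands with a discrete wavelet transform.
coeffs = pywt.wavedec(np.where(mask, signal, 0.0), "db4", level=5)

# Inpainting schemes iteratively threshold coefficients and re-impose the
# observed samples; here we just reconstruct to show the round trip.
reconstructed = pywt.waverec(coeffs, "db4")
```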

 
 
Proposed by: Yves Revaz (Faculty)
 
Type of project: Master project/Semester project
Massive stars are known to end their lives in powerful explosions named core-collapse supernovae (CCSNe). These explosions not only release a huge quantity of energy, but also newly synthesised chemical elements such as carbon, nitrogen, oxygen and iron. Crucial information on the properties of CCSNe can thus be inferred by studying the composition of stars formed from gas polluted by those supernovae.
 
While stars more massive than about eight solar masses were believed to explode as supernovae, new stellar models predict that they can also directly collapse to a black hole without releasing any synthesised products. Understanding under which conditions a supernova explodes is a pressing question in astrophysics.
 
We propose to tackle this problem by studying how massive stars, whether they explode or not, pollute the smallest galaxies, the so-called dwarf galaxies. By simulating the evolution of those dwarfs in a cosmological context, including the explosion of massive stars, we will predict the composition of new generations of stars and directly compare them with observed data. Combining numerical predictions with observations will provide new constraints on the properties of supernova progenitors.
 
 
 
Proposed by: Yves Revaz (Faculty)
 
Type of project: Master project/Semester project
Dwarf galaxies are the faintest galaxies known; however, they are by number the most abundant ones. Discrepancies between numerical predictions and observations, in particular regarding the size or metal content of the faintest dwarfs, challenge the current cosmological model.
 
In this work, we propose to improve our model of dwarf galaxies by implementing and testing a new star formation scheme that allows individual stars to form and explode in our galaxy formation code. In this so-called sink particle scheme, sink particles are created from dense and cold gas particles and subsequently generate individual stars.
 
After some updates to the code, the student will first run idealized simulations of the interstellar medium. The method will then be applied to simulate the formation of a dwarf galaxy in an evolving portion of the Universe.
 
 
 
 
Proposed by: Tianyue Chen (postdoc) and Jean-Paul Kneib (Faculty)
 
Type of project: Master project
Redshift surveys in the radio waveband using 21 cm radiation from neutral hydrogen (HI) provide a competitive probe of the physical properties of the Universe. One way to carry out radio redshift surveys is through a technique called HI intensity mapping (IM). IM is the study of the large-scale fluctuations in the intensity of a given spectral line emitted by a number of unresolved objects. EPFL is part of the HIRAX project dedicated to HI IM, which is currently building its telescope in South Africa.
 
The focus of this master project is to forecast cosmological constraints from the HIRAX telescope through its cross-correlation with other redshift surveys. The student will learn the latest HI IM techniques as well as the state-of-the-art optical galaxy surveys. During this project, the student will use both the analytical Fisher matrix method and numerical simulations for projecting cosmological constraints. The student is expected to gain hands-on experience with Python and statistical data analysis by the end of the project.
 
 
 
 
Proposed by: Jiaxi Yu (PhD student), Tianyue Chen (postdoc) and Jean-Paul Kneib (Faculty)
 
Type of project: Master project
 
 

In the last few decades, many cosmological probes have been proposed to study the Universe's expansion history and, in particular, dark energy. 3D mapping of the Universe measures the 2D structures in the sky at different redshifts (1D), and thus provides time-evolving information on the cosmic evolution history. 3D maps can be obtained either through a galaxy survey using optical telescopes such as SDSS [1], or through radio telescopes by detecting the 21cm emission line from neutral hydrogen in the sky [2].

Quasars, as extremely bright objects in the sky, are frequently observed by optical galaxy surveys to obtain their positions in the expanding Universe. However, due to their broad spectral lines and astrophysical effects, quasar measurements are often subject to redshift and position uncertainties [3]. Meanwhile, 21cm detections from radio telescopes provide accurate redshift information at the locations of quasars. Combining the measurements of optical surveys and 21cm radio surveys will help improve the accuracy of quasar measurements, and thus the resulting cosmological measurements.

During this project, the student will stack 21cm radio maps at the locations of quasars measured by optical surveys [4]. The goal is to reduce quasar redshift uncertainties using radio information, with the ultimate aim of improving current cosmological constraints. The student will acquire knowledge of both optical and radio cosmology using 3D maps. The student is expected to gain hands-on experience with Python and statistical data analysis by the end of the project.
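A minimal sketch of the stacking step (map, positions, and cutout size are placeholders): averaging many cutouts centred on catalogued quasar positions boosts the common 21cm signal relative to the uncorrelated noise.

```python
import numpy as np

def stack_cutouts(radio_map, positions, half_size=10):
    """Average square cutouts of `radio_map` centred on `positions`.

    Stacking many noisy cutouts at catalogued quasar locations boosts
    the common 21 cm signal relative to the uncorrelated noise.
    """
    stack = np.zeros((2 * half_size + 1, 2 * half_size + 1))
    n_used = 0
    for row, col in positions:
        if (half_size <= row < radio_map.shape[0] - half_size and
                half_size <= col < radio_map.shape[1] - half_size):
            stack += radio_map[row - half_size: row + half_size + 1,
                               col - half_size: col + half_size + 1]
            n_used += 1
    return stack / max(n_used, 1)

# Hypothetical map and quasar pixel positions:
radio_map = np.random.normal(0, 1, (1024, 1024))
positions = np.random.randint(0, 1024, size=(500, 2))
mean_cutout = stack_cutouts(radio_map, positions)
```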

References:

[1] https://classic.sdss.org

[2] https://arxiv.org/pdf/astro-ph/0401340.pdf

[3] https://arxiv.org/abs/1602.03894

[4] https://arxiv.org/pdf/2202.01242.pdf

 
Proposed by: Andrei Variu (PhD student), Cheng Zhao (PostDoc), Jean-Paul Kneib (Faculty)
 
 

Cosmic voids are large regions of space in the Universe with low matter densities. These structures are influenced by the expansion of the Universe and the clustering of matter. Thus, they can be used to better understand cosmology and the large-scale structure of the Universe.

The purpose of this project is to test whether numerical models can be used to extract cosmological information from the clustering of voids, and to perform such an analysis on simulations.
The student will learn how to code better in Python and how to extract information using Monte Carlo techniques and Bayesian inference methods.

 
References:
  • https://arxiv.org/abs/1904.01030
  • https://arxiv.org/abs/1712.07575
  • https://arxiv.org/abs/0712.3049
 
Proposed by: Andrei Variu (PhD student), Cheng Zhao (PostDoc), Jean-Paul Kneib (Faculty)
 
 

Cosmic voids are large regions of space in the Universe with low matter densities. These structures are influenced by the expansion of the Universe and the clustering of matter. Thus, they can be used to better understand cosmology and the large-scale structure of the Universe.

The purpose of this project is to predict the constraining power of the size distribution of cosmic voids on cosmological parameters, using the Fisher matrix formalism.
The student will learn how to code better in Python and how to estimate the amount of information that an observable carries about an unknown parameter (using the Fisher matrix); a numerical sketch of the formalism follows.
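A minimal numerical sketch of the formalism, assuming a Gaussian likelihood with parameter-dependent mean: F_ij = (dmu/dtheta_i)^T C^-1 (dmu/dtheta_j), with finite-difference derivatives. The "void size function" below is a toy placeholder.

```python
import numpy as np

def fisher_matrix(model, theta0, cov, eps=1e-4):
    """F_ij = (dmu/dtheta_i)^T C^-1 (dmu/dtheta_j) for a Gaussian
    likelihood whose mean `model(theta)` depends on the parameters."""
    theta0 = np.asarray(theta0, dtype=float)
    inv_cov = np.linalg.inv(cov)
    derivs = []
    for i in range(theta0.size):
        step = np.zeros_like(theta0)
        step[i] = eps
        derivs.append((model(theta0 + step) - model(theta0 - step)) / (2 * eps))
    derivs = np.array(derivs)
    return derivs @ inv_cov @ derivs.T

# Placeholder "void size function": amplitude and slope as toy parameters.
radii = np.linspace(10, 50, 20)
def model(theta):
    amplitude, slope = theta
    return amplitude * radii ** slope

cov = np.diag((0.05 * model([1.0, -2.0])) ** 2)   # 5% diagonal errors
F = fisher_matrix(model, [1.0, -2.0], cov)
sigma = np.sqrt(np.diag(np.linalg.inv(F)))        # 1-sigma forecasts
```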

 
References:
  • https://arxiv.org/pdf/1909.11107.pdf
  • https://arxiv.org/pdf/2206.01709.pdf
  • https://arxiv.org/pdf/2205.11525.pdf
  • https://arxiv.org/abs/1511.04299
 
Proposed by: Andrei Variu (PhD student), Jean-Paul Kneib (Faculty)
 
The Cosmology Redshift Survey is part of the 4MOST project, which plans to measure the spectra of 8 million galaxies. Before the spectroscopic survey begins, the targets of interest (Bright Galaxies and Luminous Red Galaxies) must be selected from catalogues based on photometric data. An important requirement is that these targets be homogeneously distributed on the sky. However, given the observational conditions, very bright stars, or Galactic dust extinction, the homogeneity of targets cannot be guaranteed. Consequently, one has to understand and take into account all these factors before the cosmological analysis is performed.

The purpose of this project is to check all possible sources of inhomogeneity and to create models that can account for them.
The student will learn how to code better in Python, make comprehensive figures, and analyse large datasets.

 
 
References:
  • https://arxiv.org/abs/1704.00338
  • https://arxiv.org/abs/1508.04478
  • https://arxiv.org/pdf/2208.08515v1.pdf
  • https://arxiv.org/pdf/2208.08513v1.pdf
  • https://arxiv.org/pdf/1903.02474.pdf
 
Proposed by: Michele Bianco (Postdoc), Jean-Paul Kneib (Faculty)
 
Type of project: Master project [can be adapted for TP-IVb]
 
Project flavor: Computational cosmology & astrophysics
 

Cosmic reionization is studied using radiative transfer (RT) codes to simulate the radiative feedback from the primordial galaxies and stars in the early Universe. Part of the simulation uses a series of differential equations to keep track of the evolution of the neutral hydrogen fraction and the thermal evolution of the gas in the intergalactic medium (IGM), based on the computed ionizing and heating radiation. These simulations constitute the theoretical groundwork for the upcoming Square Kilometre Array (SKA) radio telescope. Ideally, they require huge volumes (~1 Gpc size) with many particles (~10^11) to reproduce the relevant cosmological scales. Therefore, RT and heating simulations require considerable computational power, as the number of operations needed to solve the differential equations grows steeply with the number of simulated particles and sources.

Here, we propose to upgrade the existing C2Ray code, broadly used to simulate the cosmic Epoch of Reionisation. This code uses the short-characteristics ray-tracing method to compute the propagation of UV radiation from the primordial galaxies and black holes into the IGM. The code was written in the late 1990s (then updated over the years) in Fortran and is CPU-parallelized with MPI and OpenMP. However, over the years, computational algorithms have evolved. New ray-tracing methods (e.g., domain-decomposed RT, accelerated RT, etc.) allow calculating the radiation propagation in cosmological simulations with little numerical diffusion. Moreover, these new techniques can be GPU-accelerated, something that was impossible in the late 1990s.

Furthermore, we intend to improve the current source model of C2Ray, which uses a simple halo-based prescription that has stayed the same over the years and takes a rather simplistic approach. The idea is to implement a model that follows the halo number count in each simulation sub-region, which we could calibrate based on the luminosity function derived from observations such as those of JWST or WFIRST.

This project would involve 1) embedding some of the core Fortran subroutines into Python and implementing GPU acceleration; 2) implementing a modern ray-tracing method; and 3) implementing a new source model based on the sub-grid dark matter halo number density. The final goal is to compare the computational accuracy and speedup of the new ray-tracing code with a set of tests and the new source model, finalized with one or more publications.
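For flavour, a minimal sketch of the kind of rate equation tracked per cell: the ionized hydrogen fraction evolving under a photoionization rate and case-B recombination (all values are illustrative placeholders, not C2Ray internals).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative per-cell parameters (placeholders):
gamma = 3e-13        # photoionization rate [1/s]
alpha_b = 2.6e-13    # case-B recombination coefficient [cm^3/s]
n_h = 1e-4           # hydrogen number density [1/cm^3]

def dxdt(t, x):
    """x = ionized fraction; dx/dt = photoionization - recombination."""
    return gamma * (1.0 - x) - alpha_b * n_h * x**2

# Evolve one cell for ~100 Myr starting essentially neutral.
t_end = 100 * 3.15e13  # seconds
sol = solve_ivp(dxdt, (0.0, t_end), [1e-6], rtol=1e-8)
print(f"final ionized fraction: {sol.y[0, -1]:.4f}")
```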

 
Proposed by: Michele Bianco (Postdoc), Jean-Paul Kneib (Faculty)
 
Type of project: Master project [can be adapted for TP-IVb]
 
Project flavor: Cosmology simulation, Machine Learning, Radio interferometry simulation
 

The Epoch of Reionization (EoR) is a crucial period in our Universe’s history, which includes the birth of the first radiating sources that influenced the formation and evolution of latter-day structures. These luminous objects produced enough UV radiation (energy ≥ 13.6 eV) to propagate into the intergalactic medium (IGM), ultimately transitioning our Universe from a cold and neutral state to a hot and ionized state. This exciting period is one of the least understood epochs in the Universe’s evolution due to the lack of direct observations.

We can probe the reionization process by observing the redshifted 21-cm neutral hydrogen signal produced during cosmic reionization. The Square Kilometer Array telescope will be sensitive enough to detect 21-cm statistical quantities and produce images of the signal distribution in the sky. A sequence of such 21-cm images at different redshifts will constitute a three-dimensional data set known as a 3D tomographic dataset (or 21-cm lightcone).

Here, we propose to employ SKA tomographic data as a guide to identify regions of interest for future and ongoing galaxy observations. By resolving physical scales down to 5 arcseconds in the plane of the sky, SKA can pinpoint the locations of individual ionized bubbles. These regions are of interest for infrared and near-infrared space telescopes (e.g., JWST, WFIRST, Euclid), which aim to observe galaxy cluster formation in the early Universe.

In recent years, artificial intelligence (AI) and deep learning techniques have proven to be powerful tools. For this reason, this project would involve 1) developing a Bayesian Convolutional Neural Network (BCNN) that predicts the position and area of high-redshift galaxy clusters from SKA 21-cm images based on the ionized region morphology; 2) running the student's own 21-cm simulations and post-processing existing large N-body simulations to test the approach on different EoR source models; and 3) implementing extragalactic radio point sources and Galactic synchrotron emission to test the approach on different existing foreground contamination avoidance techniques.
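A minimal sketch of the network idea (not a tuned Bayesian CNN): a small fully convolutional model mapping a 21-cm image to a per-pixel ionized-bubble probability. A Bayesian variant could replace the layers with probabilistic ones (e.g. from tensorflow-probability) or use Monte Carlo dropout, as hinted in the comments.

```python
import tensorflow as tf

# Sketch: a small fully-convolutional network mapping a 21-cm image to a
# per-pixel "ionized bubble" probability mask (input size is a placeholder).
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu",
                           input_shape=(128, 128, 1)),
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
    tf.keras.layers.Dropout(0.2),   # kept at inference time for MC dropout
    tf.keras.layers.Conv2D(1, 1, activation="sigmoid"),  # bubble probability
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```
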
Proposed by: A.Neronov
Type of project: Master Project
 
Cosmic rays entering the Earth's atmosphere produce extensive air showers (EAS) of high-energy particles that can be observed by an array of particle detectors on the ground. This phenomenon was discovered back in 1938 by Pierre Auger at the Jungfraujoch laboratory in Switzerland. The EAS detection technique is nowadays used to study the highest-energy particles known in nature (ultra-high-energy cosmic rays reaching 100 exa-electronvolts) and for observations of the highest-energy gamma-rays (reaching peta-electronvolt energies). One possible type of simple particle detector is a water tank equipped with a fast photosensor, a photomultiplier tube (PMT). High-energy particles passing through the tank faster than the speed of light in water emit blue Cherenkov light that can be detected with the PMT. The goal of the Master's work is to participate in the development of a prototype Water Cherenkov Detector Array (WCDA) in Lake Geneva for observations of EAS with energies in the tera-electronvolt to peta-electronvolt range. This prototype will be used for a feasibility study of a large WCDA for measuring the cosmic-ray electron spectrum up to the highest energies and for improving the sensitivity of observations of the gamma-ray sky in the tera-electronvolt to peta-electronvolt energy range.
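For a worked example of the Cherenkov condition mentioned above (emission requires beta > 1/n), the threshold total energy is gamma_th * m c^2 with gamma_th = 1/sqrt(1 - 1/n^2):

```python
import math

def cherenkov_threshold_mev(rest_mass_mev, n=1.33):
    """Total energy above which a charged particle emits Cherenkov
    light in a medium of refractive index n (condition: beta > 1/n)."""
    beta_th = 1.0 / n
    gamma_th = 1.0 / math.sqrt(1.0 - beta_th**2)
    return gamma_th * rest_mass_mev

# Thresholds in water (n ~ 1.33): electrons radiate above ~0.8 MeV,
# muons above ~160 MeV total energy.
print(f"electron: {cherenkov_threshold_mev(0.511):.2f} MeV")
print(f"muon:     {cherenkov_threshold_mev(105.7):.1f} MeV")
```
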
Proposed by: A.Neronov
Type of project: Master Project
 
Magnetic fields that are relics of the epochs right after the Big Bang may still reside in the low-density regions of the large-scale structure, between galaxies and galaxy clusters. Detection of these fields and measurement of their properties might provide a valuable "window" on physical processes that operated in the early Universe a fraction of a second after the Big Bang. Such a measurement is possible with the methods of gamma-ray and radio astronomy, using new observational facilities: the gamma-ray Cherenkov Telescope Array (CTA) and the radio Square Kilometre Array (SKA). The gamma-ray measurement technique is based on the observation of extended glow around distant extragalactic sources, produced by electromagnetic cascades developing along the gamma-ray beam during its propagation through the intergalactic medium. The radio technique is based on the observation of Faraday rotation of polarised radio signals passing through the intergalactic medium. The goal of the Master's work is to obtain an overview of possible mechanisms of generation of magnetic fields during the first microsecond after the Big Bang and their evolution toward the present-day intergalactic magnetic field, and to study the possibility of constraining the "cosmological magnetogenesis" scenario with CTA and SKA observations.
Proposed by: V.Savchenko, A.Neronov
Type of project: Master Project
 
Although the existence of extragalactic high-energy neutrinos is firmly established, their origin is highly debated. Modern high-energy neutrino observatories (primarily IceCube) routinely announce detections of new events, and it is thought that at least some of these events are related to transient sources or source activity episodes (gamma-ray bursts, tidal disruption events, blazars, etc.). Observing coincident multi-wavelength activity of a source at the time and location of a detected neutrino would reveal the origin of the neutrino by associating it with a known source, and give important insight into the origin of the entire population of known high-energy neutrinos.
In the frame of the proposed master project, the student will explore and select the available data sets, and learn and perform astrophysical data analysis using state-of-the-art cloud-based computing technologies.
The results of the work will be published in the open collaborative data analysis platform, and might be re-used by the researcher community.
The project is expected to conclude with publishing some astrophysical micro-publications, and possible contribution to scientific papers.
Proposed by: V.Savchenko, A.Neronov
Type of project: Master Project
 
As the number of known distant planets grows rapidly, some researchers are turning their sights on previously unexplored properties of planets in our own Solar System. In recent years, unique multi-wavelength observations have been collected by several telescopes. Some of these observations remain underexplored. The project will focus on gamma-ray and X-ray observations of Solar System planets (especially Venus and Jupiter) and will provide important insight into the interactions between high-energy particles of solar origin and the planets' atmospheres, which have an essential impact on atmospheric evolution.
In the frame of the proposed master project, the student will explore and select the available data sets, and learn and perform astrophysical data analysis using state-of-the-art cloud-based computing technologies.
The results of the work will be published in the open collaborative data analysis platform, and might be re-used by the researcher community.
The project is likely to conclude with a contribution to scientific papers.
Proposed by: V.Savchenko, A.Neronov
Type of project: Master Project
 
It was recently discovered that the Universe is full of very short and intense bursts of radio waves: the Fast Radio Bursts (FRBs). The origin of these events remains highly unclear. Although the only direct observation of a Galactic fast radio burst tied it to an extremely magnetized neutron star (a magnetar), it is not certain that this case can be extrapolated to the entire population of FRBs. Further multi-wavelength and multi-messenger observations are routinely performed to find any associated emission that could help pinpoint the origin of these puzzling events, but due to the large space of possible associations, much of the available data remains unexplored.
In the frame of the proposed master project, the student will explore and select the available data sets, and learn and perform astrophysical data analysis using state-of-the-art cloud-based computing technologies.
The results of the work will be published in the open collaborative data analysis platform, and might be re-used by the researcher community.
The project is expected to conclude with publishing some astrophysical micro-publications, and possible contribution to scientific papers.

CONTEXT

To characterize the physical properties of space debris, eSpace/LASTRO is currently expanding its observation capabilities. EPFL has access to the TELESTO telescope, located at the Sauverny observatory close to Geneva, which would represent a good observing facility. However, in its current state, the telescope is not well suited for the observation of space debris: there is no possibility to target or track objects in Earth orbit. To overcome these problems and enable TELESTO for space debris observations, the telescope control software needs to be improved. A longer-term goal is the complete automation of the telescope, and the work done in this project represents an important step towards achieving this goal.

PROJECT SCOPE

During the project, you will familiarize yourself with the telescope and its software environment. You will learn about the requirements of passive optical observations of space debris. In order for the telescope to be used for space debris observations, the control of the various subsystems of the facility needs to be integrated into an easy-to-use interface. The resulting software module should provide a command-line or script-based interface through which observers can create observation plans that are automatically processed by the telescope. The basic requirements for the module are pointing and tracking based on two-line elements (TLEs), camera and filter control, as well as routines for automated acquisition of calibration frames.
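As a flavour of the TLE-based pointing computation, a minimal sketch using the skyfield library (the TLE is a stale ISS example taken from the sgp4 documentation, and the observatory coordinates are illustrative approximations):

```python
from skyfield.api import EarthSatellite, load, wgs84

# Example (stale) TLE for the ISS, from the sgp4 documentation;
# operational use would fetch current elements for the target object.
line1 = "1 25544U 98067A   19343.69339541  .00001764  00000-0  40967-4 0  9997"
line2 = "2 25544  51.6439 211.2001 0007417  17.6667  85.6398 15.50103472202482"

ts = load.timescale()
satellite = EarthSatellite(line1, line2, "ISS (ZARYA)", ts)

# Approximate coordinates of the Sauverny observatory (illustrative values).
observatory = wgs84.latlon(46.31, 6.14, elevation_m=465)

# Topocentric alt/az: the pointing the telescope mount needs to track.
t = ts.utc(2019, 12, 9, 20, 0, 0)
alt, az, distance = (satellite - observatory).at(t).altaz()
print(f"alt {alt.degrees:.2f} deg, az {az.degrees:.2f} deg, "
      f"range {distance.km:.0f} km")
```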

Proposed by: Jennifer Schober & Yves Revaz 
 
Type of project: Master/Specialization [can be adapted for TPIVb]
 
Project flavor: Theory/Numerical simulations
 
Magnetic fields are observed in all kinds of galaxies. Yet their origin, evolution, and effects on astrophysical processes are not fully understood. The leading idea is that weak seed magnetic fields, possibly generated already shortly after the Big Bang, are amplified by so-called magnetohydrodynamical dynamos. These processes convert turbulent kinetic energy into magnetic energy exponentially in time up to saturation.
 
In this project, the student will study different galactic dynamo processes, including the small-scale turbulent dynamo. He or she will then post-process cosmological simulations of dwarf galaxies to analyze the potential role of dynamo activity. The velocity field will be extracted to estimate the growth rate of the turbulent dynamo, and the final magnetic energy can be estimated using results from direct numerical MHD simulations in the literature.
 
Finally, scaling relations between the magnetic field strength and other characteristic galactic parameters, obtained from the set of post-processed simulated dwarf galaxies, will be compared to observational data. 
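For flavour, a minimal sketch of the kind of order-of-magnitude estimate involved, using the Re^(1/2) scaling of the small-scale-dynamo growth rate for Kolmogorov-type turbulence (prefactors of order unity omitted; all numbers illustrative).

```python
import numpy as np

def dynamo_growth_rate(u_rms, ell, reynolds):
    """Order-of-magnitude growth rate of the small-scale turbulent dynamo,
    Gamma ~ (u_rms / ell) * Re**0.5 for Kolmogorov-type turbulence
    (order-unity prefactors omitted; assumption, not a fitted result)."""
    return (u_rms / ell) * np.sqrt(reynolds)

# Illustrative ISM-like numbers: 10 km/s turbulence on 100 pc scales.
pc = 3.086e13  # km per pc
gamma = dynamo_growth_rate(u_rms=10.0, ell=100 * pc, reynolds=1e7)
print(f"e-folding time ~ {1 / gamma / 3.15e7:.2e} yr")
```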
 
URL/References:
Estimates of magnetic fields in purely hydrodynamic simulations:  https://ui.adsabs.harvard.edu/abs/2019arXiv191107898L/abstract
Observational relation between the magnetic field and the star formation rate in dwarf galaxies: https://ui.adsabs.harvard.edu/abs/2011A%26A...529A..94C/abstract

Past projects

Proposed by: Abhijit Bendre (Postdoc), Yoan Rappaz (PhD Student), & Jennifer Schober (Group Leader) 
 
Type of project: Master/Specialization [can be adapted for TPIVb]
 
Project flavor: Theory/Data Analysis
 
Kiloparsec-scale coherent magnetic fields are known to exist in almost all disc galaxies. These fields have strengths of a few microgauss and show long-range regularity both in the radial direction and in the vertical direction away from the galactic plane. Such vast magnetization is usually attributed to a turbulent dynamo mechanism, aided by galactic differential rotation and turbulence driven by supernova explosions.

Observation of the partially polarized synchrotron radiation emitted by relativistic cosmic rays gyrating around magnetic field lines is one of the key tools for mapping the strength and topology of galactic magnetic fields. In particular, the Faraday rotation measure of this radiation constrains the strength and direction of the line-of-sight component of the magnetic field, while the planar component's strength is inferred from the degree of polarization at various frequencies.
 
Using data from magnetohydrodynamic simulations of the supernova-driven interstellar medium, the student will estimate these observables, predict the properties of polarized synchrotron emission at various frequencies, and compare them with the relevant observations.
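As a flavour of one such observable, a minimal sketch of the Faraday rotation measure integral, RM = 0.812 * integral(n_e B_parallel dl), with n_e in cm^-3, B in microgauss, dl in pc, and RM in rad/m^2, along a line of sight through a simulation box (values illustrative).

```python
import numpy as np

def rotation_measure(n_e, b_parallel, dl_pc):
    """Faraday rotation measure RM = 0.812 * sum(n_e * B_par * dl),
    with n_e in cm^-3, B_par in microgauss, path elements in pc;
    result in rad/m^2."""
    return 0.812 * np.sum(n_e * b_parallel * dl_pc)

# Illustrative line of sight through a simulation box: 100 cells of 10 pc.
n_e = np.full(100, 0.03)                        # electron density [cm^-3]
b_par = np.random.normal(0.0, 2.0, 100)         # line-of-sight field [uG]
print(f"RM = {rotation_measure(n_e, b_par, 10.0):+.2f} rad/m^2")
```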
 
URL/References:
Local simulations of the interstellar medium:
https://ui.adsabs.harvard.edu/abs/2015AN....336..991B/abstract
 
On the measurement of interstellar magnetic fields by radio synchrotron emission:
https://ui.adsabs.harvard.edu/abs/2015A%26ARv..24....4B/abstract
The detection of astronomical signals above instrumental noise is a crucial aspect of all astronomical observations. The source-finding algorithms developed for 3D spectral line data (data cubes) are physics-agnostic. We propose to develop a source-finding algorithm that uses information about expected galaxy line profiles and shapes to improve sensitivity and specificity, produces estimates of radio source parameters, and provides errors on those parameters.

The project will involve developing a signal-model framework which will be used for a likelihood-ratio test. Performance of the model will be compared against null-hypothesis likelihood tests and a standard source-finding package such as SoFiA. The algorithm will be developed on simulated data and tested on data from one of the SKA precursor experiments.
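For flavour, a minimal sketch of a likelihood-ratio detection statistic under the simplest assumptions (known signal shape, white Gaussian noise), where it reduces to a matched filter; the project's framework would generalize this considerably.

```python
import numpy as np

def matched_filter_snr(data, template, noise_sigma):
    """Likelihood-ratio test statistic for a known signal shape in white
    Gaussian noise; thresholding this SNR is the optimal detector under
    those (simplifying) assumptions."""
    template = template / np.linalg.norm(template)
    return np.dot(data, template) / noise_sigma

# Hypothetical spectral-line profile (Gaussian) buried in a noisy spectrum.
channels = np.arange(128)
profile = np.exp(-0.5 * ((channels - 64) / 5.0) ** 2)
spectrum = 0.5 * profile + np.random.normal(0, 1.0, 128)
print(f"SNR = {matched_filter_snr(spectrum, profile, 1.0):.2f}")
```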

References:
https://arxiv.org/abs/1501.03906

Proposed by: Daniel Forero (PhD), Jean-Paul Kneib (Faculty)
 
Type of project: Specialization [can be adapted for TPIVb]
 
Project flavor: Data analysis
 
Clustering measurements from the spatial distribution of tracers, obtained from large redshift surveys, allow us to constrain cosmological parameters. The resulting measurements, however, tend to be noisy due to cosmic variance. We want to address this residual noise using deep learning techniques, such that we are able to denoise spectra from one or a few realizations. This would potentially:
  • Reduce the number of cosmological volumes required to obtain smooth templates used for cosmological fits (i.e. void BAO)
  • Assess whether the denoising procedure is unbiased such that data SNR can be improved.

In this project the student will write and test different deep learning approaches to denoising the power spectra, starting with convolutional autoencoders on the data available (though generating more data is possible). The student will address generalization issues and identify an appropriate training protocol (i.e. using clean target data vs. using noisy target data). If possible, the student may evaluate any bias the denoiser may add in terms of cosmological parameters.
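A minimal sketch of a 1D convolutional autoencoder for this task; the layer sizes, the number of k-bins, and the Noise2Noise-style pairing of noisy inputs with noisy targets are all choices to be explored, not prescriptions.

```python
import tensorflow as tf

# Minimal 1D convolutional autoencoder for power-spectrum denoising
# (a sketch: sizes are placeholders, not a tuned architecture).
n_bins = 256  # number of k-bins in each measured spectrum (placeholder)
model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(32, 5, padding="same", activation="relu",
                           input_shape=(n_bins, 1)),
    tf.keras.layers.MaxPooling1D(2),                 # compress
    tf.keras.layers.Conv1D(64, 5, padding="same", activation="relu"),
    tf.keras.layers.UpSampling1D(2),                 # restore resolution
    tf.keras.layers.Conv1D(1, 5, padding="same"),    # denoised spectrum
])
model.compile(optimizer="adam", loss="mse")
```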

References

Denoising: https://arxiv.org/pdf/1803.04189.pdf

Clustering measurements:

https://academic.oup.com/mnras/article/473/4/4773/4443205

https://arxiv.org/pdf/2007.09009.pdf

Proposed by: Richard Anderson (EPFL), Laurent Eyer (UniGE)
 
Type of project: Master project
 
The properties of the expanding Universe are an interesting and very active field of research. Currently there is a “tension” in the determination of the Hubble constant, H0, between its estimates obtained from:
(a) the early universe measurements such as microwave background and
(b) more classical methods based on the establishment of a distance ladder with standard candles, such as Cepheid pulsating stars and supernovae.
 
Several other classical distance indicators have been used, among them the Tip of the Red Giant Branch (TRGB), whose analysis seems to slightly reduce the “Hubble tension”. The principle of this tip is quite simple: low-mass stars evolve on the giant branch, burning their hydrogen in a shell and becoming brighter, until suddenly the stellar core begins its helium burning, which reshuffles the star's entire structure. Observationally, a star reaching the tip fades away to the horizontal branch. For a given galaxy, a colour-magnitude diagram of all its stars conspicuously reveals a maximum luminosity, the tip, for the stars of the red giant branch; assuming that this luminosity is similar in different galaxies makes it a distance indicator.
 
In this Master Project, you will work on improving the Tip of the Red Giant Branch as a distance tracer by including variability information in its localization. Ironically, astronomers often argue that the TRGB has the advantage over other distance indicators of not requiring multi-epoch measurements. Here we will leverage time-series data to improve one of the most precise methods of distance determination, which is playing a major role in the understanding of the Hubble tension mentioned above. The study will be done for stars in our Galaxy (with data from the European Space Agency mission Gaia and the Caltech project ZTF), and in the Large and Small Magellanic Clouds (with data from the pioneering Polish OGLE project and Gaia). Subgroups of stars will be assembled according to their variability properties (amplitudes, periods, period ratios, etc.), and a detailed analysis of their behaviour near the TRGB in these different galaxies will be carried out. Our goal is to recalibrate the TRGB with the method described above using DR2/EDR3 and to prepare the analysis for when Gaia DR3 becomes available (first half of 2022), making this a very timely project.
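For flavour, a minimal sketch of the classic TRGB edge detection that the project would extend with variability information: the tip appears as a sharp jump in the luminosity function, located here with a Sobel-type filter on a mock population (all numbers are placeholders).

```python
import numpy as np

# Mock population: faint RGB stars up to a tip at M = -4.05, plus a
# sparser population of brighter AGB contaminants (placeholders).
mags = np.concatenate([
    np.random.uniform(-4.05, -2.0, 20000),   # RGB stars below the tip
    np.random.uniform(-6.0, -4.05, 3000),    # brighter AGB contamination
])
counts, edges = np.histogram(mags, bins=np.arange(-6.0, -2.0, 0.02))
centers = 0.5 * (edges[:-1] + edges[1:])

# Sobel-type edge filter: the strongest response marks the tip.
sobel = np.abs(np.convolve(counts, [-2, -1, 0, 1, 2], mode="same"))
tip = centers[np.argmax(sobel)]
print(f"estimated TRGB magnitude: {tip:.2f} (true: -4.05)")
```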
 
Proposed by: Cameron Lemon (Postdoc), Frédéric Courbin (Faculty)
 
 
Type of project: Master project
 

Gravitationally lensed quasars are valuable probes for studying astrophysics and cosmology. In particular, the magnification effect allows us to observe and study supermassive black holes only a few hundred million years after the start of the Universe. The number and growth rates of these quasars have significant consequences for our understanding of the reionisation of the Universe and of black hole formation and growth mechanisms; however, only a few systems are known.

This project will aim to find these elusive systems using a spectral energy distribution model-fitting approach, based on multi-wavelength broadband photometric data. Once candidates are identified, pixel modelling will be used to select the most promising systems for spectroscopic follow-up.

 
Proposed by: Cameron Lemon (Postdoc), Frédéric Courbin (Faculty)
 
Type of project: Master project / CSE Project
 
Flavour: Observation / Data science

Gravitationally lensed quasars are of utmost importance for astrophysics and cosmology. However, these systems must first be located on the sky. Of particular interest are quadruply imaged quasars (quads); however, these are very rare, and often blended in ground-based datasets. This project proposes the use of catalogues from the space-based telescope Gaia to find new quads (only ~40 are currently known), to then be spectroscopically followed up and used for a variety of science goals: microlensing, galaxy mass measurements, time-delay cosmography, etc. The student will be encouraged to propose their own investigations and selection methods, but will start with an investigation of differences in parameters between the multiple Gaia data releases, analysing the results for known quads and common contaminant systems. The final result should be a selection algorithm that recovers known lenses and finds new potential candidates. Contact   for discussion/further details.

 
 
Proposed by: Aymeric Galan (PhD), Frédéric Courbin (Faculty)
 
Type of project: Master project

When analyzing strong gravitational lenses, we need to assume specific models for the light and mass distribution of the lens galaxy, as well as a light model for the source galaxy. In the past few years, several studies, most recently the work by Cao et al. (2021), have demonstrated limitations of the models currently in use, arguing that they are too simplistic. Even though such simple models can reproduce observations, using them can lead to strong biases on the inferred properties of the lens galaxies, and on the value of the Hubble constant. In their work, the authors simulated observations of gravitational lenses using spectroscopic data to obtain realistic galaxy mass profiles. These simulations have been publicly released by the authors. The goal of this project is to apply a new modeling code (already developed) that supports more complex mass models to this dataset, and to compare the resulting quantities with previous, simpler models.

 
Proposed by: Antonella Garzilli (Postdoc), Jean-Paul Kneib (Faculty)
 
Type of project: Master project

Dark matter is a universal phenomenon that can be observed from galactic scales to the largest observable scales. We intend to investigate the nature of dark matter from the observation of the filaments that constitute the intergalactic medium. The intergalactic medium is observed in absorption in quasar spectra. We intend to train a neural network with mock data generated from cosmological numerical simulations of different cosmologies with cold and warm dark matter. We intend to investigate whether the classical estimators used in the Lyman-alpha forest are optimal estimators or whether other estimators could be considered.

References:

https://arxiv.org/abs/0711.3358

https://arxiv.org/abs/1809.06585

 
Proposed by: Antonella Garzilli (Postdoc), Jean-Paul Kneib (Faculty)
 
Type of project: Master project

The Universe has been filled with a cosmic web of baryonic filaments, called the intergalactic medium, since very high redshift. The intergalactic medium is largely observed in absorption in the spectra of distant and bright objects such as quasars. It has been speculated that the width of the absorbing lines in quasar spectra is due to the spatial extent of the filaments. We aim to directly investigate this statement with the use of numerical simulations. The student will analyze the outputs of existing cosmological numerical simulations and compare them with mock spectra extracted from the same simulations.

References:

https://arxiv.org/abs/0711.3358

https://arxiv.org/abs/1809.06585

 
Proposed by: Cheng Zhao (Postdoc), Jean-Paul Kneib (Faculty)
 
Type of project: Master project
 

Massive cosmological spectroscopic surveys provide a unique way to probe the 3D distribution of large-scale structures in the Universe, which can be used to measure the expansion history and growth of structures, thus constraining the Hubble parameter, as well as the properties of different cosmological components, such as the dark energy equation of state and neutrino mass.

As the product of spectroscopic surveys, the database of 3D positions of galaxies and quasars has been growing rapidly. In the past decade, the SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS) and the SDSS-IV extended BOSS (eBOSS) have measured millions of spectra. The ongoing Dark Energy Spectroscopic Instrument (DESI) is expected to take over 30 million spectra in 5 years. Next-generation experiments, such as the MegaMapper, the MUltiplexed Survey Telescope (MUST), and the Spectroscopic Survey Telescope (SpecTel), are foreseen to increase the number of galaxy/quasar positions by another order of magnitude. Thus, we expect a factor of 10 gain in the figure of merit of cosmological constraints over the current results in the next 10-15 years.

In this project, we aim at performing a Fisher information matrix analysis of future large-scale spectroscopic surveys, to provide forecasts on various cosmological analyses, including the standard baryonic acoustic oscillation (BAO) and redshift-space distortion (RSD) measurements in the flat-ΛCDM framework, as well as constraints on alternative model parameters, such as the primordial non-Gaussianity, the curvature, neutrino mass, and the dark energy equation of state. To this end, we shall accomplish realistic estimations of the number of targets, properties of the targets, and sky area and redshift range coverage, based on the expected specifications of future surveys.

 
Proposed by: Tianyue Chen (Postdoc), Jean-Paul Kneib (Faculty)
Type of project: Master project
 

We have come to an era of precision cosmology, where cosmic microwave background (CMB) measurements from the Planck satellite constrain the standard ΛCDM model to <1% accuracy. However, to study the dark sector dominating the late-time Universe, observations at much lower redshift than the CMB are required. This can be achieved by measuring the large-scale structure of the Universe through HI intensity mapping (IM), which maps the large-scale HI intensity fluctuations with the ultimate objective of detecting the baryonic acoustic oscillations (BAO) and constraining dark energy. Alternatively, gravitational lensing provides an independent method to probe the dark sector through, e.g., strong-lensing time-delay measurements and weak-lensing shear measurements.

The latest constraint on H0 from time-delay lensed quasars by the H0LiCOW collaboration is consistent with that from the independent type Ia supernovae measurement under the standard ΛCDM cosmology. However, they are in roughly 5σ disagreement with CMB and BAO measurements. To understand the high statistical significance of the H0 tension, the student will combine HI IM with other probes, including time-delay lensed quasars, the CMB, BAO from optical surveys, and type Ia supernovae, in order to jointly constrain H0 and explore possible new physics that could resolve the H0 tension between early- and late-Universe probes. The project will involve Fisher matrix projections of SKA-like IM experiments, with prior information from other probes. Depending on progress, the work can potentially extend to end-to-end simulations as a secondary step.


Proposed by: Jennifer Schober
 
Type of project: Master/Specialization [can be adapted for TPIVb]
 
Project flavor: Theory/Numerical simulations
 

The Universe is permeated by magnetic fields, yet their origin is one of the greatest mysteries of cosmology. Could they have been generated already shortly after the Big Bang or are they only a property of the modern Universe? How do they evolve in cosmic history and how do they influence the formation of cosmic structures?

To answer these questions, astrophysical plasmas are often described with magnetohydrodynamics (MHD). However, at high energies, such as less than a second after the Big Bang or in hot young neutron stars, MHD necessarily needs to be extended to include additional electric currents caused by quantum anomalies related to the chirality of fermions. Recently, such "chiral MHD" has been studied intensively with direct numerical simulations, which yield a very different evolution of magnetic fields compared to the classical MHD case.
One difference is the possible occurrence of chiral magnetic waves, a spatial propagation of the asymmetry between left- and right-handed fermions, but also the propagation of Alfvén waves is modified in chiral MHD.

In this project, the student will perform direct numerical simulations with the Pencil Code to study the properties of chiral magnetic waves. A set-up for a wave scenario will be provided, and the student will explore new regimes of the parameter space, analyzing the spatial propagation of waves, time series, and energy spectra.
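
As an illustration of the kind of time-series analysis involved, a minimal sketch of extracting the dominant oscillation frequency of a wave from a simulation output; the arrays below are synthetic placeholders standing in for real Pencil Code data.

```python
# Sketch: estimate the dominant frequency of a wave from a time series,
# e.g. the magnetic field at a fixed point in a chiral MHD run.
# t and b are synthetic placeholders for real Pencil Code output.
import numpy as np

t = np.linspace(0.0, 100.0, 4096)                # hypothetical time axis
b = np.sin(0.8 * t) * np.exp(-0.005 * t)         # hypothetical damped wave

# FFT-based power spectrum of the (mean-subtracted) signal.
dt = t[1] - t[0]
power = np.abs(np.fft.rfft(b - b.mean()))**2
freqs = np.fft.rfftfreq(len(t), d=dt)            # cycles per time unit

f_peak = freqs[np.argmax(power[1:]) + 1]         # skip the zero mode
print("angular frequency ~", 2 * np.pi * f_peak)
```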

URL/References:
Simulations of chiral waves [see Sections 4.3 and 4.4]: https://ui.adsabs.harvard.edu/abs/2018arXiv180806624S/abstract
Theory of chiral MHD [see Section 5 for a discussion of the dispersion relation in chiral MHD]: https://ui.adsabs.harvard.edu/abs/2017ApJ...846..153R/abstract
Pencil Code: http://pencil-code.nordita.org/

Proposed by: Richard Anderson (new faculty)

Type of project: Master/CSE project

Flavour: Observation/Data science

Distances are the fundamental quantity for a majority of astronomical inferences, and pulsating stars allow us to measure very precise distances. However, unknown or poorly quantified systematic effects require urgent assessment because they could bias the most precise measurements of the expansion rate of the Universe. The stakes are high: if pulsating-star distances are correct, then cosmology may be missing fundamental physics; if the distances are wrong, then one of the most pressing astrophysical conundrums would be solved.
 
This project will use recently published data from the ESA space mission Gaia to investigate systematic distance uncertainties. First, we will mine Gaia’s database (1.8 billion objects) for pulsating stars in our Milky Way and in other galaxies. Second, we will collect available data from other sources. Using the combined dataset, we will investigate different sources of systematic distance errors due to astrophysical or instrumental effects.
 
More broadly, the project provides a deep dive into a brand-new large dataset collected in space, offers a chance to work with unprecedented stellar time-series observations collected by different methods, and gives an opportunity to contribute to forefront research in stellar astronomy and observational cosmology.
 
References:

Proposed by: Jean-Paul Kneib (Faculty)

Type of project: Master/CSE project

Flavour: Observation/Data science

The new constellations of 5G communication satellites threaten the dark sky, as we expect a substantial increase in relatively bright satellite tracks in wide-field astronomical observations.
 
The project will consist of improving current techniques for the detection, identification, and characterisation of satellites as well as Near-Earth-Orbit objects, with a focus on astronomical archival datasets. The idea is to develop a pipeline that automatically analyses new data as it becomes available in public archives. A common starting point for detecting such linear tracks is sketched below.
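
One standard approach to detecting line-like satellite tracks is an edge detector followed by a Hough transform. A minimal sketch with scikit-image, using a synthetic image as a placeholder; the actual pipeline and data format are to be defined in the project.

```python
# Sketch: detect line-like streaks in an image using a Canny edge map
# plus a probabilistic Hough transform (scikit-image). `image` is a
# synthetic placeholder; real frames would be loaded from FITS files.
import numpy as np
from skimage.feature import canny
from skimage.transform import probabilistic_hough_line

image = np.zeros((256, 256))
rr = np.arange(256)
image[rr, rr] = 1.0                      # synthetic diagonal streak

edges = canny(image, sigma=2.0)
segments = probabilistic_hough_line(
    edges, threshold=10, line_length=50, line_gap=5)

for (x0, y0), (x1, y1) in segments:
    angle = np.degrees(np.arctan2(y1 - y0, x1 - x0))
    print(f"streak segment at {angle:.1f} deg")
```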
 
References:
https://arxiv.org/pdf/2003.01992.pdf
http://www.eso.org/~ohainaut/satellites/
https://www.esa.int/Safety_Security/Space_Debris/The_current_state_of_space_debris
http://astria.tacc.utexas.edu/AstriaGraph/
 
 

Proposed by: Cameron Lemon (Postdoc), Frédéric Courbin (Faculty)

Type of project: Master/CSE project

Flavour: Observation/Data science

Gravitationally lensed quasars are of utmost importance for astrophysics and cosmology. However, these systems must first be located on the sky. One way to find them is to search the spectra of massive galaxies for signs of a background, higher-redshift quasar; if such signs are present, there is a strong possibility that the background quasar is multiply imaged. This project will aim to develop techniques to discover such systems in existing SDSS/BOSS spectra through model fitting and systematic testing on mocks, with forecasts for future spectroscopic surveys (such as 4MOST and Euclid). Promising candidates will be targeted for resolved spectroscopy. The project can also be extended to discovering massive-galaxy absorption lines in the spectra of high-redshift quasars as possible lens candidates.
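
One way to frame the detection step: after subtracting a galaxy continuum model, scan the residual spectrum for a quasar emission line with a matched Gaussian template. A minimal sketch with synthetic arrays standing in for SDSS/BOSS spectra:

```python
# Sketch: search a residual spectrum (data minus galaxy model) for a
# Gaussian emission line by scanning over trial centers. All arrays
# are synthetic placeholders.
import numpy as np

wave = np.linspace(4000.0, 9000.0, 3000)            # Angstrom, hypothetical
resid = np.random.normal(0.0, 1.0, wave.size)       # continuum-subtracted
resid += 5.0 * np.exp(-0.5 * ((wave - 7200.0) / 8.0)**2)  # injected line
sigma_pix = np.ones_like(wave)                      # per-pixel errors

def line_snr(center, width=8.0):
    """Matched-filter S/N for a Gaussian line at a trial wavelength."""
    template = np.exp(-0.5 * ((wave - center) / width)**2)
    num = np.sum(template * resid / sigma_pix**2)
    den = np.sqrt(np.sum(template**2 / sigma_pix**2))
    return num / den

centers = np.arange(4100.0, 8900.0, 4.0)
snr = np.array([line_snr(c) for c in centers])
print("best candidate:", centers[np.argmax(snr)], "A, S/N =", snr.max())
```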

URL/References:

https://arxiv.org/abs/1810.04480 (see figure 8)
https://arxiv.org/abs/1604.01842
https://arxiv.org/abs/1110.5514

Proposed by: Richard Anderson (new faculty)

Type of project: Master/CSE project

Flavour: Observation/Data science

Most stars appear to be alone when observed through a telescope. However, the majority of stars actually reside in double, triple, or even higher-order multiple systems. Knowing whether pulsating stars are single or in binaries is crucial for accurately measuring their distances and for determining the cosmic distance scale.
 
In this project, we will test a new method for detecting companion stars to Cepheid variable stars and for determining the binary/multiplicity fraction of these important objects. We will then estimate the impact of the companion stars on the distance measurements to the stars themselves.
 
More broadly, the project gives an opportunity to apply a well-described method to a new scientific problem. The student will work with the latest available observational data, some of it as yet unpublished, and will support ongoing research at the forefront of pulsating-star astrophysics and distance measurements.
 
References:

Proposed by: Richard Anderson (new faculty)

Type of project: Master/CSE project (also adaptable for TP4b)

Flavour: Observation/Data science

Distances are the fundamental quantity for a majority of astronomical inferences, and pulsating stars allow us to measure very precise distances. However, unknown or poorly quantified systematic effects require urgent assessment because they could bias the most precise measurements of the expansion rate of the Universe. The stakes are high: if pulsating-star distances are correct, then cosmology may be missing fundamental physics; if the distances are wrong, then one of the most pressing astrophysical conundrums would be solved.
 
This project will use recently published data from the ESA space mission Gaia to calibrate stellar distance measurements. First, we will identify the host clusters of pulsating stars using the latest Gaia data release. Then, we will calibrate Leavitt's law using the parallaxes of other cluster member stars. The calibrated Leavitt law can then be applied to measure distances to other galaxies, or to individual pulsating stars across our Milky Way.
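
Schematically, the calibration amounts to converting parallaxes to absolute magnitudes and fitting a period-luminosity (Leavitt) relation. A minimal sketch with hypothetical numbers, deliberately ignoring the parallax zero-point and bias terms that the project would actually quantify:

```python
# Sketch: fit a Leavitt law M = a*log10(P) + b from parallaxes.
# All values below are synthetic placeholders for Gaia-based data.
import numpy as np

period = np.array([3.0, 5.2, 10.1, 22.0, 45.0])      # days, hypothetical
m_app = np.array([9.8, 9.1, 8.0, 6.9, 5.9])          # apparent magnitudes
parallax_mas = np.array([0.50, 0.45, 0.40, 0.35, 0.30])

# Distance modulus from parallax (no zero-point correction here).
d_pc = 1000.0 / parallax_mas
mu = 5.0 * np.log10(d_pc) - 5.0
M_abs = m_app - mu

a, b = np.polyfit(np.log10(period), M_abs, 1)
print(f"Leavitt law: M = {a:.2f} log10(P) + {b:.2f}")
```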
 
More broadly, the project provides a deep dive into a brand-new large dataset collected in space, offers a chance to work with unprecedented stellar time-series observations collected by different methods, and gives an opportunity to contribute to forefront research in stellar astronomy and observational cosmology.
 
References:

Proposed by: Martin Millon (PhD), Frédéric Courbin (Faculty)

Type of project: Master/CSE project

Flavour: Observation/Data science

Time-delay cosmology with strongly lensed supernovae is expected to provide a highly precise measurement of the expansion rate of the Universe, i.e., the Hubble constant. The technique is based on measuring the time delays between the multiple images created by a strong gravitational lens. Although only two lensed supernovae have been discovered so far, future surveys such as LSST or ZTF are expected to discover dozens of such systems per year, improving the precision on the Hubble constant.

This project aims to adapt the current curve-shifting techniques, developed for lensed quasars, to lensed supernovae and to assess their reliability. The first part of the project will consist of producing mock lensed-supernova light curves before developing the method to analyze them. A precise and accurate measurement of the time delays is critical, as the errors propagate directly to the final estimate of the Hubble constant.
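
The core of curve shifting can be illustrated with a simple grid search over trial delays, interpolating one light curve onto the other and minimizing a chi-square. A minimal sketch with synthetic curves (the actual project would use realistic mocks with microlensing and noise models):

```python
# Sketch: estimate the time delay between two light curves by a grid
# search over trial shifts. Synthetic data stand in for mock lensed
# supernova light curves.
import numpy as np

t = np.linspace(0.0, 100.0, 200)
signal = lambda t: np.exp(-0.5 * ((t - 40.0) / 10.0)**2)   # SN-like pulse
true_delay = 12.0
lc_a = signal(t) + np.random.normal(0, 0.02, t.size)
lc_b = signal(t - true_delay) + np.random.normal(0, 0.02, t.size)

def chi2(delay):
    shifted = np.interp(t, t + delay, lc_a)   # shift A by the trial delay
    return np.sum((shifted - lc_b)**2)

trial = np.linspace(0.0, 30.0, 301)
best = trial[np.argmin([chi2(d) for d in trial])]
print(f"recovered delay: {best:.1f} d (true {true_delay})")
```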

URL/References:

https://arxiv.org/abs/1802.01584

https://arxiv.org/abs/1805.04525

https://arxiv.org/abs/2002.05736

https://arxiv.org/abs/1208.5598

Proposed by: Martin Millon (PhD) and Frédéric Courbin (Faculty)

Type of Project: TP4b or Master/CSE project

Project flavour: Data Science/Simulation

Abstract:

Microlensing is a unique tool to study the inner structure of the accretion disks of supermassive black holes that power quasars. It is produced by stars in the foreground lensing galaxy passing in front of the background lensed quasar, resulting in an extra amplification of its luminosity due to the gravitational lensing effect.

The aim of the project is to pursue the development of a Convolutional Neural Network (CNN) to analyze this signal. If successful on the training set, the method will be applied to real observations to obtain the first estimate of the size of a quasar accretion disk with a CNN.
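
As a schematic of the architecture (not the project's actual network), a small 1D CNN in Keras that regresses a single disk-size parameter from a microlensing light curve; all shapes and layer sizes are illustrative.

```python
# Sketch: a small 1D CNN mapping a microlensing light curve to one
# regression target (e.g. a log accretion-disk size). Illustrative only.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

N_POINTS = 500                           # hypothetical light-curve length

model = keras.Sequential([
    layers.Input(shape=(N_POINTS, 1)),
    layers.Conv1D(16, 7, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(32, 5, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1),                     # predicted disk size (log scale)
])
model.compile(optimizer="adam", loss="mse")

# Train on simulated curves: x has shape (n_samples, N_POINTS, 1).
x = np.random.randn(64, N_POINTS, 1).astype("float32")
y = np.random.randn(64, 1).astype("float32")
model.fit(x, y, epochs=1, batch_size=16, verbose=0)
```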

URL/References:

https://arxiv.org/pdf/1903.09170.pdf

https://arxiv.org/pdf/1205.4727.pdf

Proposed by: Ginevra Favole (Postdoc), Jean-Paul Kneib (Faculty)

Type of project: Master project
Project flavour: simulations/galaxies
 
Sub-halo abundance matching (SHAM) and the halo occupation distribution (HOD) model are the most widely used techniques for populating dark matter haloes in N-body simulations with observed galaxies. These prescriptions allow us to link the dark and visible sectors of our Universe, directly probing the stellar-to-halo-mass relation. By coupling SHAM and HOD with large-volume, high-resolution cosmological simulations, we can generate accurate galaxy mock catalogues, which we then use to model the galaxy clustering and weak-lensing signals, the two cosmological probes that ongoing and future surveys such as DESI, Euclid, 4MOST, and LSST will provide.

This project aims to adapt existing SHAM and HOD software scripts to new N-body products and to make them publicly available on GitHub for future cosmological analyses.
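
For reference, the standard five-parameter HOD form (Zheng et al. 2005) that such scripts typically implement: mean numbers of central and satellite galaxies as a function of halo mass. The parameter values below are illustrative only.

```python
# Sketch of the standard HOD (Zheng et al. 2005): mean occupation of
# central and satellite galaxies per halo of mass M. Parameter values
# are illustrative placeholders.
import numpy as np
from scipy.special import erf

log_Mmin, sigma_logM = 12.0, 0.3
M0, M1, alpha = 10**12.2, 10**13.3, 1.0

def n_cen(M):
    return 0.5 * (1.0 + erf((np.log10(M) - log_Mmin) / sigma_logM))

def n_sat(M):
    M = np.asarray(M, dtype=float)
    return n_cen(M) * (np.clip(M - M0, 0.0, None) / M1)**alpha

M = np.logspace(11, 15, 5)
print(n_cen(M) + n_sat(M))   # mean total occupation per halo
```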

 
URL/References:
https://ui.adsabs.harvard.edu/abs/arXiv:astro-ph%2F0408564

https://ui.adsabs.harvard.edu/abs/2006ApJ...647..201C/abstract

https://ui.adsabs.harvard.edu/abs/2016MNRAS.461.3421F/abstract

Proposed by: Antonella Garzilli (Postdoc), Jean-Paul Kneib (Faculty)

Type of project: Master project
Project flavour: Numerical methods
 

Dark matter is a universal phenomenon that can be observed from galactic scales to the largest observable scales. We intend to investigate the nature of dark matter through observations of the baryonic filaments that constitute the intergalactic medium, which is observed in absorption in quasar spectra. We will train a neural network on mock data generated from cosmological numerical simulations with different cosmologies, including cold and warm dark matter, and investigate whether the classical estimators used for the Lyman-alpha forest are optimal or whether other estimators could be considered.
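
The "classical estimator" referred to here is essentially the 1D flux power spectrum. A minimal sketch of how it is computed from a mock sightline, with a synthetic flux array and a simplified normalization convention:

```python
# Sketch: classical 1D flux power spectrum estimator for a Lyman-alpha
# forest sightline. The flux array is a synthetic placeholder and the
# normalization follows one common (simplified) convention.
import numpy as np

n_pix, dv = 2048, 10.0                  # pixels, km/s per pixel (hypothetical)
flux = np.clip(np.random.normal(0.8, 0.1, n_pix), 0.0, 1.0)

delta = flux / flux.mean() - 1.0        # flux contrast
fft = np.fft.rfft(delta)
k = 2 * np.pi * np.fft.rfftfreq(n_pix, d=dv)   # wavenumber in s/km
# P(k) = |delta_k|^2 * L / N^2 with L the sightline length in km/s.
power = np.abs(fft)**2 * (n_pix * dv) / n_pix**2

print(k[1:5], power[1:5])
```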

URL/References:

https://arxiv.org/abs/0711.3358

https://arxiv.org/abs/1809.06585

Proposed by: Antonella Garzilli (Postdoc), Jean-Paul Kneib (Faculty)

Type of project: Master project
Project flavour: Numerical methods
 

Since very high redshift, the Universe has been filled with a cosmic web of baryonic filaments called the intergalactic medium. The intergalactic medium is largely observed in absorption in the spectra of distant, bright objects such as quasars. It has been speculated that the width of the absorption lines in quasar spectra is due to the spatial extent of the filaments. We aim to investigate this statement directly using numerical simulations: the student will analyze the outputs of existing cosmological numerical simulations and compare them with mock spectra extracted from the same simulations.
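
One concrete measurement the comparison rests on is the line width. A minimal sketch of fitting a Gaussian-in-optical-depth profile to a mock absorption line to extract its Doppler parameter b; the data below are synthetic placeholders for spectra extracted from the simulations.

```python
# Sketch: measure the Doppler parameter b of a mock absorption line by
# fitting a Gaussian optical-depth profile with scipy.
import numpy as np
from scipy.optimize import curve_fit

v = np.linspace(-200.0, 200.0, 400)                  # km/s, hypothetical
b_true, tau0_true, v0_true = 25.0, 1.5, 0.0
flux = np.exp(-tau0_true * np.exp(-((v - v0_true) / b_true)**2))
flux += np.random.normal(0.0, 0.01, v.size)          # synthetic noise

def line_model(v, tau0, v0, b):
    return np.exp(-tau0 * np.exp(-((v - v0) / b)**2))

popt, _ = curve_fit(line_model, v, flux, p0=[1.0, 0.0, 20.0])
print(f"fitted b = {popt[2]:.1f} km/s (true {b_true})")
```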

URL/References:

https://arxiv.org/abs/0711.3358

https://arxiv.org/abs/1502.05715

Proposed by: Aymeric Galan (PhD), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: data science, statistics, image processing, neural networks

Modelling strongly lensed systems is the cornerstone of a wide range of topics. A lens model has three main components: the mass distribution of the lensing galaxy, its surface brightness distribution, and the surface brightness distribution of the source galaxy. On top of a number of degeneracies affecting the modelling, it is sometimes hard to pinpoint which component of the model is at fault when looking at the residuals (the difference between model and data). Additionally, in light of the large data samples to come, it is important to design automated procedures to replace the current visual inspection. The goal of this project is to simulate a suite of residual maps and train a neural network to classify them according to the missing component in the model.

References:

https://arxiv.org/pdf/0805.0201.pdf

https://arxiv.org/pdf/1807.09278.pdf

https://arxiv.org/pdf/1806.07897.pdf

Proposed by: Austin Peel (postdoc), Aymeric Galan (PhD), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: data science, statistics, differentiable programming, neural networks

Neural networks are increasingly employed in strong-lens modelling. In this project, the goal is to train a variational autoencoder (VAE) to learn the surface brightness of typical lensed galaxies. A potential improvement may come from the recently established "disentangled VAE" architecture. With such networks, the goal would be to extract specific features of the source galaxy, such as size, orientation, and ellipticity, directly from the abstract latent space defined by the VAE. Furthermore, using the differentiable programming framework JAX, the VAE could be included in a larger modelling pipeline in a modular way.
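
A minimal VAE sketch in JAX, to show the differentiable building blocks involved. The shapes, single dense layers, and the beta-weighted KL term (one common route to disentangled latents) are all illustrative assumptions, not the project's actual architecture.

```python
# Minimal VAE sketch in JAX (hypothetical shapes: 64x64 images flattened
# to 4096 pixels, latent dimension 8). Biases omitted for brevity.
import jax
import jax.numpy as jnp

LATENT, PIXELS, HIDDEN = 8, 64 * 64, 256

def init_params(key):
    keys = jax.random.split(key, 4)
    s = 0.01
    return {
        "enc_w":      s * jax.random.normal(keys[0], (PIXELS, HIDDEN)),
        "enc_mu":     s * jax.random.normal(keys[1], (HIDDEN, LATENT)),
        "enc_logvar": s * jax.random.normal(keys[2], (HIDDEN, LATENT)),
        "dec_w":      s * jax.random.normal(keys[3], (LATENT, PIXELS)),
    }

def encode(params, x):
    h = jnp.tanh(x @ params["enc_w"])
    return h @ params["enc_mu"], h @ params["enc_logvar"]

def decode(params, z):
    return jax.nn.sigmoid(z @ params["dec_w"])

def loss(params, x, key, beta=1.0):
    mu, logvar = encode(params, x)
    # Reparameterization trick: z = mu + sigma * eps keeps sampling
    # differentiable with respect to the network parameters.
    eps = jax.random.normal(key, mu.shape)
    z = mu + jnp.exp(0.5 * logvar) * eps
    recon = decode(params, z)
    rec_loss = jnp.sum((recon - x) ** 2)
    # KL divergence to a standard normal prior; beta > 1 gives a
    # "beta-VAE", one way to encourage disentangled latents.
    kl = -0.5 * jnp.sum(1 + logvar - mu**2 - jnp.exp(logvar))
    return rec_loss + beta * kl

grad_fn = jax.jit(jax.grad(loss))   # differentiable end-to-end
```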

References:

https://jax.readthedocs.io/en/latest/

https://arxiv.org/pdf/1910.06157.pdf

https://arxiv.org/pdf/1709.05047.pdf

https://arxiv.org/pdf/2005.12039.pdf

Proposed by: Karina Rojas (Postdoc), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: Numerical modeling, statistics, image processing

We are currently finding strong gravitational lenses in the CFIS (Canada-France Imaging Survey) in an effort to prepare for the ESA Euclid mission. These will be used as a test bench to perform massive and fast mass modeling of galaxies with the "lenstronomy" software, developed at Stanford and in part at EPFL. The work will be to implement a reliable modeling scheme and to provide a new sample of lenses with exquisite ground-based imaging. Eventually, some of these developments will be applicable to the Euclid space telescope, to be launched in 2022.
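
For orientation, a minimal example of the kind of lens-model evaluation lenstronomy provides, here ray-shooting through a singular isothermal ellipsoid; the parameter values are illustrative, and the reliable modeling scheme itself is what the project will design.

```python
# Sketch: evaluate a simple SIE lens model with lenstronomy by mapping
# an image-plane position back to the source plane (ray shooting).
# Parameter values are illustrative placeholders.
from lenstronomy.LensModel.lens_model import LensModel

lens = LensModel(lens_model_list=["SIE"])
kwargs_lens = [{"theta_E": 1.2, "e1": 0.05, "e2": -0.02,
                "center_x": 0.0, "center_y": 0.0}]

# Image-plane position (arcsec) mapped back to the source plane.
beta_x, beta_y = lens.ray_shooting(1.3, 0.4, kwargs_lens)
print(beta_x, beta_y)
```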

URL: http://www.cfht.hawaii.edu/Science/CFIS/

Proposed by: Benjamin Clément (Postdoc), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: data science, statistics, image processing, cosmology

It is proposed to design a machine learning algorithm to find giant lensed arcs in galaxy clusters. The algorithm, if successful, will set the basis for the arc-finding algorithm of the ESA Euclid space telescope, to be launched in 2022. The present project will be based on existing sharp and deep Hubble images of galaxy clusters known as the "Hubble Frontier Fields". The work will involve the design of a simulation chain to help train neural networks, which will then be applied to the Hubble images to validate their performance.

URL: https://archive.stsci.edu/prepds/frontier/

https://fr.wikipedia.org/wiki/Euclid_(télescope_spatial)

Proposed by: Cameron Lemon (Postdoc), Frédéric Courbin (Faculty)
Type of project: Master/CSE project
Flavour: data science, statistics, image processing, cosmology

Gravitationally lensed quasars are of utmost importance for cosmology: measuring their time delays can provide an independent measurement of the Hubble constant, provided enough systems are studied. Such systems can be found using machine learning applied to massive ground-based surveys such as CFIS, DES, Pan-STARRS, etc. The present work will involve the construction of simulated training sets and the design of CNNs to look for new lensed quasars in existing ground-based surveys. Depending on its success, this work may extend to the ESA Euclid space telescope.

URL: https://fr.wikipedia.org/wiki/Euclid_(télescope_spatial)

https://www.spacetelescope.org/images/heic1702d/

https://www.nasa.gov/feature/goddard/2020/cosmic-magnifying-glasses-yield-independent-measure-of-universes-expansion

Proposed by: James Chan (Postdoc), Frédéric Courbin (Faculty), and Cameron Lemon (Postdoc)

Type of Project: TP4b or Master/CSE project

Project flavour: Observation/Data Science

Strongly lensed quasars provide a powerful means to study galaxy evolution and cosmology, and there have been several undertakings to look for them in various surveys. In this project, we will apply an existing algorithm, CHITAH, to search for more lensed quasars in the field of CFIS, a legacy survey for the Canadian and French communities that provides excellent image quality over ~5,000 square degrees. We aim to find new lensed quasar candidates in various surveys, in particular CFIS. This project requires coding skills and an interest in imaging analysis.

URL/References:

http://www.cfht.hawaii.edu/Science/CFIS/

https://arxiv.org/abs/1411.5398

https://arxiv.org/abs/1911.02587

Proposed by: Yves Revaz (Faculty) and Loic Hausammann (PhD)

Type of Project: Master
Project flavour: Simulation / Galaxies

The Milky Way is known to impact the evolution of its satellites through two main mechanisms: tidal stripping and ram-pressure stripping.

Both have been studied for a long time, but the impact of the radiation emitted by the Milky Way has not yet been examined. This radiation (mostly in the UV band) can heat the cold gas and is expected to facilitate its stripping through ram pressure.

This project involves developing a model of the Milky Way's UV radiation field based on observations, and running simulations of dwarf galaxies that include this UV field through the moving-box technique.

This technique allows simulations of satellite dwarf galaxies to be run at very high resolution while still correctly accounting for the Milky Way.

URL / References:

https://arxiv.org/abs/1902.02340

http://arxiv.org/abs/1410.3829

This project takes place in practice at the University of Geneva, at the Observatoire de Sauverny.

Proposed by: Vincent Bourrier and David Ehrenreich (Unige), Frédéric Courbin (contact EPFL faculty)

Type of Project: Master/CSE project

Project flavour: Observation/Data Science

The Unige exoplanet group at the Geneva Observatory offers an internship in the frame of the “Exoplanets on the Edge” program, carried out with the HARPS-N spectrograph on a sample of transiting exoplanets at the border of the “Neptunian desert”.

This mysterious feature is a lack of Neptune-size exoplanets orbiting very close to their star, which could be linked to the way such planets migrated from their original birthplace far from the star to their present location. This hypothesis can be tested by analyzing the properties of exoplanets observed today at the border of the desert, as their orbital architecture (the shape of their orbit and its orientation with respect to the star) contains the imprint of their past migration.

As a planet transits in front of its star, it leaves a specific spectroscopic signature (the so-called Rossiter-McLaughlin effect) whose properties depend on the planet’s orbital architecture. A high-resolution spectrograph like HARPS-N, devised in Geneva and installed on a 4-m class telescope, is an ideal tool to search for these signatures and obtain valuable clues about the origins of the Neptunian desert.

The internship consists of the analysis of the HARPS-N data, the extraction of the Rossiter-McLaughlin signals, and, where relevant, their interpretation with a migration model currently in development within our group.
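
As an order-of-magnitude check often used when planning such observations, the amplitude of the Rossiter-McLaughlin anomaly scales roughly as the transit depth times the projected stellar rotation. A minimal sketch with a commonly used approximation and hypothetical parameter values:

```python
# Sketch: rough amplitude of the Rossiter-McLaughlin anomaly using the
# commonly quoted approximation Delta_V ~ (Rp/Rs)^2 * sqrt(1 - b^2) * vsini.
# Parameter values are hypothetical.
import numpy as np

rp_over_rs = 0.05        # Neptune-sized planet around a Sun-like star
impact_b = 0.3           # transit impact parameter
vsini_kms = 3.0          # projected stellar rotation, km/s

dv = rp_over_rs**2 * np.sqrt(1.0 - impact_b**2) * vsini_kms
print(f"expected RM amplitude ~ {dv * 1000:.1f} m/s")
```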
