Galformod Project

About Galformod

The Galformod project is made possible by
Advanced Grant 246797 GALFORMOD
from the European Research Council.

The Galformod Project (2009-2014)

Over the next decade, much of the effort on major astronomical facilities will be dedicated to large-scale surveys of the galaxy population. Their aim is two-fold:
  • understanding the origin and evolution of galaxies and their central supermassive black holes,
  • clarifying the nature of dark matter, dark energy and the process that produced all cosmic structure.
The goal of Galformod is to develop and release powerful and flexible modeling tools that can simulate the evolution of the galaxy population in all viable cosmologies and under a wide variety of assumptions about the governing physical processes. This will be achieved by a major expansion of the functionality and scope of the Millennium Simulation Archive.
The original Millennium Simulation will be complemented by a higher-resolution simulation (the Millennium-II) and a simulation of a much larger volume (the Millennium-XXL). The detailed evolution of nonlinear structure in these simulations will be characterised in the form of fine-grained subhalo merger trees and made publicly available as queriable databases.
New techniques will allow these merger trees to be scaled in space, mass, velocity and time to represent the growth of dark matter structure from Gaussian initial fluctuations in any currently viable cosmology. The trees then provide the backbone for simulating the formation and evolution of the galaxy population in that cosmology.
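The scaling step can be pictured with a toy dimensional argument. The sketch below is purely illustrative and is not the Galformod algorithm: the function name, the choice of inputs, and the assumption that the rescaling factors `s` and `sv` are simply given are all hypothetical (in a real method such factors would be determined by matching the target cosmology).

```python
# Schematic illustration only (not the actual Galformod code): rescaling a
# halo's properties from one cosmology to another by a pure dimensional
# argument. `s` is an assumed length rescaling factor and `sv` an assumed
# velocity rescaling factor; both are hypothetical inputs here.

def rescale_halo(mass, position, velocity, s, sv):
    """Apply a simple length/velocity rescaling to one halo.

    mass     -- halo mass (scales as length cubed at fixed mean density)
    position -- comoving position, tuple of 3 floats
    velocity -- peculiar velocity, tuple of 3 floats
    """
    new_mass = mass * s**3                       # M ~ L^3 at fixed density
    new_position = tuple(s * x for x in position)   # lengths scale by s
    new_velocity = tuple(sv * v for v in velocity)  # velocities scale by sv
    return new_mass, new_position, new_velocity

# Example: shrink lengths by 10% (s = 0.9) with a chosen velocity factor.
m, p, v = rescale_halo(1.0e12, (10.0, 20.0, 30.0), (100.0, 0.0, -50.0),
                       s=0.9, sv=0.95)
```

The same rescaling would be applied uniformly to every node of a merger tree, so the tree's topology is preserved while its masses, sizes and timings are mapped into the target cosmology.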
In a first phase, galaxy catalogues will be made publicly available, both in "snapshot" and in "light-cone" form, for several different galaxy formation models and in several different cosmologies. In a second phase, users will be able to adjust the physical parameters of the galaxy formation model and of the background cosmology, in order to explore how such changes affect the visible properties of the galaxy population. In a final phase, users will be able to change the assumptions underlying the galaxy formation modelling by programming their own modules and inserting them into the simulation pipeline.
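As an illustration of what "queriable" means in practice, a mass-selected galaxy sample might be requested with a short SQL statement. The sketch below is hypothetical: the table and column names (`galaxies`, `galaxyId`, `stellarMass`, `snapnum`) are placeholders, not the actual archive schema.

```python
# Illustrative only: assembling an SQL query of the kind a Millennium-style
# galaxy database might accept. All table and column names are hypothetical
# placeholders, not the real archive schema.

def build_mass_selected_query(min_stellar_mass, snapnum, limit=100):
    """Return an SQL string selecting galaxies above a stellar-mass cut
    at a single output snapshot."""
    return (
        "SELECT TOP {limit} galaxyId, stellarMass, x, y, z "
        "FROM galaxies "
        "WHERE snapnum = {snap} AND stellarMass > {mmin}"
    ).format(limit=limit, snap=snapnum, mmin=min_stellar_mass)

# Example: the 100 most convenient galaxies above the cut at snapshot 63.
query = build_mass_selected_query(min_stellar_mass=1.0, snapnum=63)
```

Server-side selections of this kind let users download only the galaxies they need, rather than the full multi-terabyte catalogue.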

Millennium Simulations

The original Millennium Run, published in 2005, used more than 10 billion particles to trace the evolution of the matter distribution in a cubic region of the Universe more than 2 billion light-years on a side. It kept the principal supercomputer at the Max Planck Society's Supercomputing Centre in Garching, Germany busy for more than a month. By applying sophisticated modelling techniques to the 25 Tbytes of stored output, Virgo scientists have been able to recreate evolutionary histories both for the 20 million or so galaxies which populate this enormous volume and for the supermassive black holes which occasionally power quasars at their hearts. By comparing such simulated data to large observational surveys, one can clarify the physical processes underlying the buildup of real galaxies and black holes.
The Millennium-II Simulation, published in 2009, used the same cosmology, the same output structure and the same number of simulation particles as the Millennium Run, but followed the growth of structure in a region five times smaller in linear size, resulting in 125 times better mass resolution. A combination of the two simulations allows the formation of the galaxy population to be followed consistently over the full range of scales, from the tiny dwarfs visible only in the outskirts of our own Milky Way to the giant cD galaxies seen at the centres of rich galaxy clusters.
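The factor of 125 is simply the cube of the linear size ratio: at fixed particle number and mean density, the particle mass scales with the simulated volume. A minimal check of the arithmetic:

```python
# At fixed particle number and mean density, particle mass scales with the
# simulated volume, i.e. with the cube of the box side length.
shrink = 5                     # Millennium-II box side is 5x smaller
resolution_gain = shrink**3    # particle mass is smaller by this factor
print(resolution_gain)         # prints 125
```

The same cube law works in reverse for the Millennium-XXL: a box six times larger on a side covers 216 times the volume.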
The Millennium-XXL Simulation, completed in 2010, with first detailed results published in 2012, again has the same cosmology and output structure as the original Millennium Run, but uses over 300 billion particles to follow the growth of structure in a region six times larger in linear size. The goal here is to follow evolution through a volume comparable in size to that of the largest current observational surveys.