Infrastructure

Main components of the infrastructure

The Blue Brain workflow depends on a large-scale computing and data infrastructure, providing:

  • A 65,536-core IBM Blue Gene/Q supercomputer for modeling and simulation (hosted at CSCS), providing extended capabilities for data-intensive supercomputing (BlueGene Active Storage);
  • A 40-node analysis & visualization cluster;
  • An OpenStack/Ceph private cloud running in two regions;
  • Different storage systems for data archiving and neuroinformatics;
  • A modern platform for continuous integration and collaborative software development.

In collaboration with EPFL’s Laboratory for Neural Microcircuitry (LNMC):

  • State-of-the-art technology for the acquisition of data on different levels of brain organization: multi-patch clamp set-ups for studying the electrophysiological behavior of neural circuits, multi-electrode arrays (MEAs) for stimulating and recording from brain slices, facilities for creating and studying cell lines expressing particular ion channels, a variety of imaging systems, and systems for the 3D reconstruction of neural morphologies.

Data

The success of the Blue Brain project depends on very high volumes of standardized, high-quality experimental data covering all possible levels of brain organization. Data comes from the literature (via the project’s automatic information extraction tools), from the Human Brain Project, from large data acquisition initiatives outside the project, and from EPFL’s Laboratory for Neural Microcircuitry (LNMC).

High Performance Computing

The Blue Brain workflow creates enormous demands for computational power. In Blue Brain cellular-level models, the representation of the detailed electrophysiology and communication of a single neuron can require as many as 20,000 differential equations. Even modern multi-core workstations have difficulty solving this number of equations in biological real time. In other words, the only way to ensure reasonable turn-around times for simulating networks of neurons is the use of High Performance Computing (HPC).
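To make this argument concrete, here is a rough back-of-envelope sketch in Python. Only the 20,000-equation figure comes from the text above; the integration time step, the per-equation cost, and the sustained workstation throughput are illustrative assumptions, not project figures.

    # Back-of-envelope estimate: why one detailed neuron already strains a workstation.
    EQUATIONS_PER_NEURON = 20_000          # from the text: detailed cellular-level model
    TIME_STEP_S = 25e-6                    # assumed integration step (25 microseconds)
    FLOPS_PER_EQUATION_STEP = 20           # assumed cost of updating one equation once
    SUSTAINED_WORKSTATION_FLOPS = 10e9     # assumed sustained throughput (~10 GFLOP/s)

    steps_per_biological_second = 1.0 / TIME_STEP_S
    flop_per_neuron_second = (EQUATIONS_PER_NEURON
                              * FLOPS_PER_EQUATION_STEP
                              * steps_per_biological_second)

    # How many such neurons could the workstation keep up with in biological real time?
    neurons_in_real_time = SUSTAINED_WORKSTATION_FLOPS / flop_per_neuron_second

    print(f"Work per neuron per biological second: {flop_per_neuron_second:.2e} FLOP")
    print(f"Detailed neurons sustainable in real time: {neurons_in_real_time:.2f}")

Under these assumptions a single workstation cannot even follow one detailed neuron in real time, which is why networks of tens of thousands of neurons are simulated on HPC systems.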

The Blue Brain project’s simulation of the neocortical column incorporates detailed representations of at least 30,000 neurons – in most cases several times this number, in order to provide valid boundary conditions. A simulation of a whole-brain rat model at the same level of detail would represent up to 200 million neurons and would require approximately 10,000 times more memory. Simulating the human brain would require yet another 1,000-fold increase in memory and computational power. Subcellular modeling, modeling of the neuro-glial vascular system and the creation of virtual instruments (e.g. virtual EEG, virtual fMRI) will further expand these requirements.
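The scaling factors in this paragraph can be turned into a simple memory estimate. In the sketch below, only the 10,000x (whole rat brain) and additional 1,000x (human brain) multipliers come from the text; the baseline footprint assumed for a single column simulation is a hypothetical value chosen purely for illustration.

    # Illustrative memory scaling, using the multipliers quoted in the text.
    COLUMN_MEMORY_TB = 1.0        # assumed footprint of one column run (hypothetical)
    RAT_BRAIN_FACTOR = 10_000     # from the text: whole-brain rat model vs. one column
    HUMAN_BRAIN_FACTOR = 1_000    # from the text: further increase for a human brain

    rat_brain_memory_tb = COLUMN_MEMORY_TB * RAT_BRAIN_FACTOR
    human_brain_memory_tb = rat_brain_memory_tb * HUMAN_BRAIN_FACTOR

    print(f"Rat brain estimate:   {rat_brain_memory_tb:>13,.0f} TB")
    print(f"Human brain estimate: {human_brain_memory_tb:>13,.0f} TB")

Even with a modest assumed baseline, the multiplicative jumps quickly reach petabyte and exabyte scales, far beyond the memory of any single machine.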

In the initial phase of its work, the Blue Brain project used an IBM Blue Gene/L supercomputer with 8,192 processors. It then moved to a 16,384-core IBM Blue Gene/P supercomputer with almost eight times more memory than its predecessor. Today, it uses an IBM Blue Gene/Q supercomputer with 65,536 cores and extended memory capabilities, hosted by the Swiss National Supercomputing Center (CSCS) in Lugano.