EPFL’s Blue Brain Project is pleased to announce the open sourcing of RTNeuron. Originally one of Blue Brain’s first domain-specific interactive visualization tools, RTNeuron has matured, in its latest version, into a scalable real-time rendering tool for the visualization of detailed neuronal simulations based on cable models. It supports the visualization of neurons and synapses, the playback of simulation data, and the display of other basic geometrical objects.
30 August 2018
The aim of the Blue Brain Project is to build accurate, biologically detailed, digital reconstructions and simulations of the rodent brain. The supercomputer-based reconstructions and simulations built by the project offer a radically new approach for understanding the multi-level structure and function of the brain.
Blue Brain’s data-driven modelling cycle allows the project to build digital reconstructions of the rodent brain at an unprecedented level of biological detail. RTNeuron was originally conceived as a standalone application for visualizing simulation results; it has since evolved into a framework for the interactive visualization and media production of results from detailed neuronal network simulations. The latest version allows the visualization of neurons and synapses, the playback of simulation data, and the display of other basic geometrical objects (for example spheres, lines and meshes). It also provides advanced rendering capabilities such as order-independent transparency, parallel rendering, and a custom technique for fast rendering of neuron-like geometry.
RTNeuron consists of an OpenGL rendering backend written as a C++ library, a Python wrapper and a Python command-line application. GUI overlays can be created for specific use cases using PyQt and QML. Existing power applications allow the inspection of circuit connectivity, the virtual slicing and dyeing of a circuit, and the playback of simulation data. The tool also makes use of the Equalizer framework for rendering, which means that multiple GPUs, or even GPU-based clusters, can be exploited to speed up the rendering through parallel rendering techniques. The latest version leverages recent changes in Blue Brain’s open source C++ library Brion to provide basic support for the open SONATA file format for representing circuits and simulations, jointly developed by the Blue Brain Project and the Allen Institute for Brain Science.
Blue Brain’s Computing Director, Prof. Felix Schürmann, commented: “We recognize that knowledge sharing through open sourcing is an important driving force behind scientific progress. The latest version of RTNeuron as a real-time rendering tool is important for our scientific progress at Blue Brain, and we hope that the neuroscientific community will also benefit from it now being open sourced. In particular, making available to the community advanced capabilities to interactively and visually inspect structural and functional features of brain tissue models, and simplifying the generation of high-quality movies and images for presentations and publications, is a great addition to the ecosystem of models and tools around the SONATA format.”
RTNeuron user Prof. Michele Migliore of the Italian National Research Council added: “RTNeuron is an invaluable tool in my everyday work. Its ease of use, flexibility, and very efficient implementation make it a wonderful way to visualize and explore both the static and the dynamical properties of the large-scale cellular model circuits I am implementing and investigating.”
RTNeuron is available under the GPLv3 license at https://github.com/BlueBrain/RTNeuron
More information on the open SONATA file format for representing circuits and simulations now supported by RTNeuron can be found at https://github.com/BlueBrain/sonata
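To give a rough sense of what SONATA describes, a circuit configuration is a JSON file that points at HDF5 node and edge files plus CSV type tables. The sketch below is illustrative only: the field names follow the published specification as best understood here, and the repository linked above remains the authoritative schema reference.

```python
import json

# Illustrative, minimal SONATA-style circuit configuration.
# Field names are assumptions based on the public spec; consult the
# SONATA repository for the authoritative schema.
circuit_config = {
    "networks": {
        "nodes": [
            {
                "nodes_file": "$NETWORK_DIR/nodes.h5",
                "node_types_file": "$NETWORK_DIR/node_types.csv",
            }
        ],
        "edges": [
            {
                "edges_file": "$NETWORK_DIR/edges.h5",
                "edge_types_file": "$NETWORK_DIR/edge_types.csv",
            }
        ],
    },
}

# A tool such as RTNeuron (through Brion) would resolve these paths and
# load the HDF5 payloads; here we only round-trip the JSON to show the
# shape of the configuration.
text = json.dumps(circuit_config, indent=2)
parsed = json.loads(text)
print(parsed["networks"]["nodes"][0]["nodes_file"])
```

The division of labour is the point of the format: lightweight JSON for wiring the pieces together, HDF5 for the bulk node and edge data, and CSV for shared type attributes.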
Contributors and Funding
RTNeuron has been jointly developed by the EPFL Blue Brain Project and the Universidad Politécnica de Madrid. The main financial support was provided by the ETH Board funding to the Blue Brain Project and by Cajal Blue Brain (funded by the Spanish Ministerio de Ciencia, Innovación y Universidades). Additional funding was provided by the European Union’s Horizon 2020 research and innovation programme under grant agreement no. 720270 (HBP SGA1).