Front Neuroinform. 2014 Oct 10;8:78. doi: 10.3389/fninf.2014.00078. eCollection 2014.

Spiking network simulation code for petascale computers.

Author information

1. Simulation Laboratory Neuroscience - Bernstein Facility for Simulation and Database Technology, Institute for Advanced Simulation, Jülich Aachen Research Alliance, Jülich Research Centre, Jülich, Germany; Programming Environment Research Team, RIKEN Advanced Institute for Computational Science, Kobe, Japan.
2. Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), Jülich Research Centre and JARA, Jülich, Germany.
3. Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), Jülich Research Centre and JARA, Jülich, Germany; Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences, Aas, Norway.
4. Advanced Center for Computing and Communication, RIKEN, Wako, Japan.
5. Neural Computation Unit, Okinawa Institute of Science and Technology, Okinawa, Japan; Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan.
6. Integrated Systems Biology Laboratory, Department of Systems Science, Graduate School of Informatics, Kyoto University, Kyoto, Japan.
7. Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan.
8. Simulation Laboratory Neuroscience - Bernstein Facility for Simulation and Database Technology, Institute for Advanced Simulation, Jülich Aachen Research Alliance, Jülich Research Centre, Jülich, Germany; Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), Jülich Research Centre and JARA, Jülich, Germany; Faculty of Psychology, Institute of Cognitive Neuroscience, Ruhr-University Bochum, Bochum, Germany.
9. Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), Jülich Research Centre and JARA, Jülich, Germany; Laboratory for Neural Circuit Theory, RIKEN Brain Science Institute, Wako, Japan; Medical Faculty, RWTH Aachen University, Aachen, Germany.
10. Programming Environment Research Team, RIKEN Advanced Institute for Computational Science, Kobe, Japan; Institute of Neuroscience and Medicine (INM-6), Institute for Advanced Simulation (IAS-6), Jülich Research Centre and JARA, Jülich, Germany.

Abstract

Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses, and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Early parallel simulation codes already stored synapses in a distributed fashion, such that a synapse consumes memory only on the compute node harboring its target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity of the synaptic target lists thus collapses along two dimensions: the type of synapse and the number of synapses of a given type. Here we present a data structure that takes advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.
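
To make the idea concrete, the following is a minimal, self-contained C++ sketch of the kind of node-local connector structure the abstract alludes to. It is an illustration under stated assumptions, not the authors' actual simulation code: the names HomConnector, HetConnector, StaticSynapse, and StdpSynapse are hypothetical. The point is that each compute node keeps, per source neuron, a target list specialized for the common case of a single synapse type, and falls back to a heterogeneous container only when several types are actually present.

// Illustrative sketch only -- hypothetical types, not the authors' code.
// On a given compute node, most source neurons have at most one local synapse
// of a single type, so the per-source target list is specialized for the
// homogeneous case and uses a heterogeneous container only when necessary.

#include <cstdio>
#include <memory>
#include <vector>

struct ConnectorBase {                       // common interface for all target lists
  virtual void send(double weighted_spike) = 0;
  virtual ~ConnectorBase() = default;
};

// Homogeneous case: all local synapses of this source share one synapse type,
// so they are stored contiguously without per-synapse type information.
template <typename SynapseT>
struct HomConnector : ConnectorBase {
  std::vector<SynapseT> synapses;            // typically holds a single element
  void send(double s) override {
    for (auto& syn : synapses) syn.deliver(s);
  }
};

// Heterogeneous case (rare per node): one sub-connector per synapse type.
struct HetConnector : ConnectorBase {
  std::vector<std::unique_ptr<ConnectorBase>> by_type;
  void send(double s) override {
    for (auto& c : by_type) c->send(s);
  }
};

// Two toy synapse types standing in for static and plastic synapse models.
struct StaticSynapse { double w; void deliver(double s) { std::printf("static  %g\n", w * s); } };
struct StdpSynapse   { double w; void deliver(double s) { std::printf("plastic %g\n", w * s); } };

int main() {
  // Typical node-local view of one source neuron: exactly one static synapse.
  auto common = std::make_unique<HomConnector<StaticSynapse>>();
  common->synapses.push_back({0.5});

  // Unusual case: the same source has synapses of two types on this node.
  auto rare = std::make_unique<HetConnector>();
  auto stat = std::make_unique<HomConnector<StaticSynapse>>();
  stat->synapses.push_back({0.5});
  auto stdp = std::make_unique<HomConnector<StdpSynapse>>();
  stdp->synapses.push_back({0.2});
  rare->by_type.push_back(std::move(stat));
  rare->by_type.push_back(std::move(stdp));

  common->send(1.0);
  rare->send(1.0);
}

In the common homogeneous case the connector stores its (typically single) synapse without any per-synapse type bookkeeping, which is what keeps the node-local memory footprint small when the network is distributed over some 100,000 compute nodes.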

KEYWORDS:

computational neuroscience; large-scale simulation; memory footprint; memory management; metaprogramming; parallel computing; supercomputer
