TVB-NEST
EBRAINS The Virtual Brain - NEST cosimulation
Concept
This is a proof of concept for the co-simulation between TVB and NEST.
It is designed to be flexible and scalable, so that it can adapt to any network simulation and run on supercomputers.
Table of Contents
- Installation and update
- Advice for using the repository
- Different Installation
- Example
- Mouse Brain
- Dependencies
- Nest
- MPI
- Python library
- Adaptation to your own usage
- The management of parameters or change parameters
- The modification of translation
- The modification of Nest configuration
- The modification of TVB configuration
- Tests
- How to test the installation
- Cluster
- DEEPEST
- Future implementation
- Extension
- Files
Installation and update:
Advice for using the repository:
To clone the project, you also need to clone the submodules (git submodule). \
I advise you to clone the repository with the following command: git clone --recurse-submodules
To update the folder, you also need to update the submodules with the following command: git submodule update.
Different Installation:
In the folder install, you have the following possibilities:
- docker :
The script '/install/deep/create_container_X.sh' creates a docker image from the configuration file '/install/deep/Nest_TVB_config_X.dockerfile'. Configuration 1 is based on the Alpine distribution and configuration 2 on the Debian distribution.\
The script '/install/deep/run_image.sh' uses the test 'tests/run_co-sim_test_docker.py'.
- singularity :
The folder contains the same files as for docker, except for the script testing the image.\
The full configuration (3) creates an image for running the example from a jupyter notebook (see the Example section).
- virtual environment :
The script 'create_virtual_python.sh' creates a virtual environment in the current path with all the Python libraries needed to run the scripts.
WARNING: the PyNest library is not available in this environment. You should compile Nest and add the path of the library to the PYTHONPATH before using it.
- installation on DEEP-EST :
The script 'install/deep/install.sh' performs the installation on the DEEP-EST cluster (see the section Cluster).
- local installation : the script is only an example of a local installation in an environment where you don't have superuser rights.
WARNING: it should not be used on your computer.
If you install on your own computer, you should look at the configuration files of the singularity installation or the local installation.\
The installation is not standardized for the moment.\
To test your installation, see the section Tests or the folder tests.
Example
In the folder example, you can find jupyter notebooks giving a demonstration of this project and some results of the simulation.
The results of the simulation can help you understand the folder structure used for saving simulation output and prepare your visualization.
The structure of the results is defined in the orchestrator file; it can be modified, but this organization helps to separate the different modules of the simulation and the logs.
The example is based on an application of the framework. For the moment, there is only one application.
Mouse Brain
This example is a demonstration of the full mouse brain with 2 regions simulated with Nest.\
To run this example, you need to create the full singularity image, which contains all the modules for the jupyter notebook.
Once you have this image, you should launch the following command from the folder example:
singularity run --app jupyter-notebook ../install/singularity/Nest_TVB_full.simg
The jupyter home page will open in your browser, and you can then launch the notebook.
This example is composed of 4 parts:
- The explanation of all the parameters of the application
- Running the application
- The display of the results of the simulation (short simulation)
- The display of the results of a long simulation
Dependencies
Nest
The version of Nest used is based on Nest 3 and is already included in the repository.\
For the dependency of Nest, you should look at this page : https://nest-simulator.readthedocs.io/en/stable/installation/linux_install.html \
For having the correct configuration parameters of Nest, you should look at this page : https://nest-simulator.readthedocs.io/en/stable/installation/install_options.html \
I use the following commands to compile Nest: \
- mkdir ./lib/nest_run \
- mkdir ./lib/nest_build \
- cd ./lib/nest_build \
- cmake ../../nest-io-dev/ -DCMAKE_INSTALL_PREFIX:PATH='../lib/nest_run/' -Dwith-python=3 -Dwith-mpi=ON -Dwith-openmp=ON -Dwith-debug=ON \
- make
MPI
MPI needs to be MPI-3 (I use the following implementation: http://www.mpich.org/static/downloads/3.1.4/mpich-3.1.4.tar.gz).
Python library
The script 'install/py_venv/create_virtual_python.sh' installs all the Python dependencies in a virtual environment in the current folder (see the Installation section).
Python libraries:
- nose
- numpy
- cython
- Pillow
- scipy
- elephant
- mpi4py
- tvb-library version >=2.0
- tvb-data version >=2.0
- tvb-gdist version >=2.0
Adaptation to your own usage
The management of parameters or change parameters:
The file 'nest_elephant_nest/orchestrator/parameters_managers.py' manages the modification of the exploration parameters and the links between parameters.
If you change a parameter, be careful about the dependencies between parameters (see the file 'example/parameter/test_nest.py').
If you explore a particular parameter, be careful about the name of the parameter, and check that it is really modified.
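As a minimal, hypothetical sketch of such a link between parameters (the section and key names below, such as 'param_co_simulation' and 'sim_resolution', are illustrative and are not the actual API of 'parameters_managers.py'):

```python
def link_parameters(parameters):
    """Recompute the parameters that depend on the explored ones.
    Hypothetical dependency: the Nest resolution follows the
    co-simulation synchronization step."""
    parameters['param_nest']['sim_resolution'] = \
        parameters['param_co_simulation']['synchronization']
    return parameters

def explore_parameter(parameters, name, value):
    """Set one explored parameter, then re-apply the links."""
    section, key = name.split('.')
    parameters[section][key] = value
    return link_parameters(parameters)

params = {
    'param_co_simulation': {'synchronization': 0.1},
    'param_nest': {'sim_resolution': 0.1},
}
# Exploring the synchronization step also updates the linked parameter.
params = explore_parameter(params, 'param_co_simulation.synchronization', 0.05)
```

This is only an illustration of why an explored parameter must be changed through the manager rather than edited in place: otherwise the dependent parameters keep their old values.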
If you modify the translation part :
The translation functions are simple to change.
The files 'nest_elephant_nest/translation/science_tvb_to_nest.py' and 'science_nest_to_tvb.py' contain all the functions for exploring different translation solutions.
The translation from TVB to Nest is composed of one function:
-
generate_spike :
This function takes as input the rates from TVB and generates an array with all the input spikes for each neuron.
Currently, the function generates spikes with an inhomogeneous Poisson generator (implemented in elephant).
The parameter is the percentage of the rate shared between all neurons. (It's a very simple translation.)
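The idea can be sketched as follows. This is a simplified, numpy-only stand-in for the actual elephant-based generator, assuming the rate is given as one value per time step of dt milliseconds:

```python
import numpy as np

def generate_spike(rates, nb_neurons, percentage_shared, dt=0.1, seed=42):
    """Simplified stand-in for the TVB-to-Nest spike generation (the
    repository uses elephant's inhomogeneous Poisson generator).
    rates: instantaneous rate in Hz, one value per time step of dt ms.
    Returns one sorted spike-time array (ms) per neuron."""
    rng = np.random.default_rng(seed)
    times = np.arange(len(rates)) * dt
    prob = np.asarray(rates) * dt * 1e-3  # spike probability per bin
    # Spikes shared by all neurons (a percentage of the rate) ...
    shared = times[rng.random(len(times)) < prob * percentage_shared]
    trains = []
    for _ in range(nb_neurons):
        # ... plus independent spikes for the remaining part of the rate.
        own = times[rng.random(len(times)) < prob * (1.0 - percentage_shared)]
        trains.append(np.sort(np.concatenate((shared, own))))
    return trains

# 100 ms of a constant 50 Hz rate, translated for 10 neurons
trains = generate_spike(np.full(1000, 50.0), nb_neurons=10, percentage_shared=0.5)
```

The percentage_shared parameter controls how correlated the generated spike trains are: at 1.0 all neurons receive identical spikes, at 0.0 they are independent.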
The translation from Nest to TVB is composed of two functions: (see the folder documentation)
- add_spikes :
This function takes an array of spikes and stores it in a buffer.
The current function stores the spikes in a histogram.
- analyse:
This function takes the buffer and transforms it into rates.
Currently, the model is based on a sliding window over the histogram: the analysis generates the rate with a sliding window.
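A minimal sketch of these two steps, assuming a fixed histogram bin size dt (this is not the actual code of 'science_nest_to_tvb.py'):

```python
import numpy as np

def add_spikes(buffer, spike_times, t_start, dt):
    """Accumulate an array of spike times (ms) into a histogram buffer."""
    bins = ((np.asarray(spike_times) - t_start) / dt).astype(int)
    for b in bins:
        if 0 <= b < len(buffer):
            buffer[b] += 1
    return buffer

def analyse(buffer, window, dt, nb_neurons):
    """Transform the histogram buffer into a rate (Hz) per neuron,
    using a sliding window of `window` bins."""
    kernel = np.ones(window) / window
    counts = np.convolve(buffer, kernel, mode='same')  # mean count per bin
    return counts / (nb_neurons * dt * 1e-3)  # mean count -> rate in Hz

buf = np.zeros(100)                                        # 10 ms, 0.1 ms bins
buf = add_spikes(buf, [1.0, 1.05, 2.3, 5.7], t_start=0.0, dt=0.1)
rate = analyse(buf, window=10, dt=0.1, nb_neurons=10)
```

The window length trades temporal resolution against the smoothness of the rate delivered to TVB.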
If you modify the Nest configuration:
You should create a new file in 'nest_elephant_nest/Nest/' and modify the orchestrator and the script 'nest_elephant_nest/Nest/run_mpi_nest.sh'.
The orchestrator needs access to 2 files for the co-simulation: spike_generators.txt and spikes_detectors.txt. These files contain the ids of the devices for the connection with Nest, and they are generated by the configuration of Nest.
If you need an example, you can look at 'nest_elephant_nest/Nest/simulation_Zerlaut.py'.
The example is composed of functions for the configuration of the simulator, with the running functions at the end.
There are 5 steps:
- Configure the kernel of Nest
- Create the populations of neurons
- Create the connections between and inside populations
- Create the devices (WARNING: this function needs to send the ids of the devices using MPI. These ids are used for the configuration of the translators.)
- Simulate
The first 4 steps are the initialization of Nest.\
If you add or remove Nest parameters, you need to change 'nest_elephant_nest/simulation/parameters_managers.py' to add or remove the links between parameters.
Moreover, the names of the parameter sections need to begin with 'param'.
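As a minimal illustration of this naming rule (this is not the orchestrator's actual check, and the section contents are made up):

```python
# Every section of the parameter dictionary must begin with 'param',
# otherwise the parameter manager cannot recognize it.
parameters = {
    'param_nest': {'sim_resolution': 0.1},   # illustrative content
    'param_tvb': {'noise': 1e-9},            # illustrative content
}
assert all(name.startswith('param') for name in parameters)
```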
If you modify the TVB configuration:
You should change the beginning of the file nest_elephant_nest/TVB/simulation_Zerlaut.py .
This file contains the initialization and configuration of TVB in the function init. All the parameters are used in this file.
The other function is the wrapper for I/O communication with MPI and the launcher of the TVB simulator.
The dependency on the Nest parameters is defined in 'nest_elephant_nest/orchestrator/parameters_managers.py'.
There are two types of monitors, depending on your use of the TVB model. Interface_co_simulation.py includes a measure from Nest inside the model. Interface_co_simulation_parallel.py replaces the coupling variable by a measure from Nest.
Tests
How to test the installation (these tests don't check the correctness of the simulation)
-
Test the Nest I/O :
- 'tests/test_input_nest_current.sh'\
The test is for the step current generator with MPI communication. The array at the end of the output contains the events recorded by a spike detector.
If there is an event from neuron 1, it means the device impacts the dynamics of the neuron.
- 'tests/test_input_nest_current_multiple.sh'\
The test is for the step current generator with MPI communication under different parameterizations of Nest. The tested parameterizations cover using only threading, only MPI processes, or a mix of the two.
The check is the same as previously for each run. The times of the spike events need to differ between runs.
- 'tests/test_input_nest_spike.sh'\
The test is for the spike generator device with MPI communication. The two arrays at the end of the output contain the events recorded by a spike detector.
The first array is a recording of neuron 1. If there are events, it means the spike generator doesn't have an impact on the neuron.
The second array is a recording of the two spike generators themselves. If there is no event from id 7, there is a problem. We can compare the values of the array with the data sent previously.
- 'tests/test_input_nest_spike_dict.sh'\
It's the same test as previously, except that the definition of the parameters is done before the creation of the device.
- 'tests/test_input_nest_spike_multi.sh'\
It's the same test as previously, except that Nest uses multithreading.
- 'tests/test_input_nest_spike_dict.sh'\
It's the same test as previously, except that Nest uses multithreading.
- 'tests/test_record_nest_spike.sh'\
The test checks the spike detector and the spike generator with MPI communication. It only checks that it is possible to run them.
- 'tests/test_record_nest_spike_multiple.sh' (also tests multithreading)\
The test checks the spike detector and the spike generator with MPI communication under different parameterizations of Nest. The tested parameterizations cover using only threading, only MPI processes, or a mix of the two.
It only checks that it is possible to run them.
- 'tests/test_nest_MPI_threading/run_test.sh'\
This script is a generic test of the multithreading of Nest. It is rather complex, but it seems to test Nest correctly.
-
Translation:
- 'tests/test_translator_tvb_to_nest.sh'\
This test is for the translation module from TVB to Nest.
You can check that it exits without error (exit of Nest Input and TVB Output and the end of the processes).
You can check that the received sizes and the arrays are different from zero for Nest INPUT; the Nest Input values (x) need to differ.
- 'tests/test_translator_nest_to_tvb.sh'\
This test is for the translation module from Nest to TVB.
You can check that it exits without error (exit of Nest Output and TVB Input and the end of the processes).
The second evaluation is by looking at the alternation between TVB INPUT and NEST OUTPUT.
The third check is that the values of Nest Output are different from zero (number of spikes received * 3).
- 'tests/test_nest_save.sh'\
This test is for the Nest save translation module. The implementation of this translator reuses the Nest interface of the Nest-to-TVB translator.
You can check that it exits without error (exit of Nest Output and the end of the processes).
-
Orchestrator:
- 'tests/run_tvb_one.py' : run a 2D exploration of TVB only, with one region
- 'tests/run_tvb.py' : run a 2D exploration of TVB only, with the test parameters
- 'tests/run_nest_one.py' : run a 2D exploration of Nest only, with one region
- 'tests/run_nest.py' : run a 2D exploration of Nest only, with the test parameters
- 'tests/run_nest.py' : run a 2D exploration of Nest with the translator for saving the mean firing rate
- 'tests/run_nest_co-sim.py' : run a 2D exploration of the co-simulation with the test parameters
- 'tests/run_nest_co-sim_test.py' : file used by test_co-sim.sh
-
Test the co-simulation:
- 'tests/test_co-sim.sh'\
This test is for the application. It tests the co-simulation under different parameterizations of Nest. The tested parameterizations cover using only threading, only MPI processes, or a mix of the two.
These tests succeed if they reach the end without any errors.
-
For testing the co-simulation in a container (see the Installation section for the creation of the images):
- 'install/docker/test_image.sh'
- 'install/singularity/test_image.sh'
These tests are based on the script 'tests/run_nest_co-sim_test.py'. They take a parameter to choose the image to test.\
For docker, 0 is for the Alpine distribution (local:NEST_TVB_IO) and 1 for the Debian distribution (local:NEST_TVB_IO_2).\
For singularity, 0 is for the full image (Nest_TVB_full.simg), 1 for the Alpine distribution (Nest_TVB_1.simg) and 2 for the Debian distribution (Nest_TVB_2.simg).
Cluster
The option for using the project on a cluster is based on the resource manager Slurm. If you have another resource manager, you need to modify 'nest_elephant/orchestrator/run_exploration.py'.
For the moment, it is impossible to pass extra options to Slurm, but it would be easy to add this feature.
WARNING: this project has only been tested on DEEPEST.
DEEPEST :
The installation and the options for the cluster are built for DEEPEST (https://www.deep-projects.eu/).
The file 'install/deep/install.sh' compiles and installs all the Python libraries in the folder lib for usage on the cluster.
To test the installation, you need to change the file /tests/init.sh: the parameters CLUSTER and DEEPEST need to be changed from 'false' to 'true'.
Future implementation:
- Add tests for the input and output of the TVB interface.
- Add tests for validating the values of a simulation.
- Add some functions to the Nest simulation to include multimeter recorders and other stimuli.
- Refactor the code to avoid duplicating the same piece of code in different files (example: the creation of a logger, ...).
Extension:
- Improve the orchestrator for managing MPI communication and the synchronization between all processes.
Files
- doc: Documentation of the project
- UML: UML of communication and state of modules
- example: an example of the usage of the proof of concept
- analyse: the functions for the analysis of the output of the examples
- parameter: parameters for the simulation and the data
- data_mouse: data from the Allen Institute mouse connectome, composed of a file of the distances between each region and the weights of the connections
- test_nest.py: parameter for testing the installation
- short_simulation: folder of the result of the short simulation
- log: folder containing all the logs of the simulation
- nest: files generated by the Nest module
- labels.csv: the label of the recorder and the type of recording
- population_GIDs.dat: the id of the neurons and the type of neurons
- spike_detector.txt: the id of spike detector
- spike_generator.txt: the id of spike generator
- other files : the output of the spike detectors and multimeters
- translation:
- receive_from_tvb: contains the MPI port for the connection of TVB to the translator during the simulation
- send_to_tvb: contains the MPI port for the connection of TVB to the translator during the simulation
- spike_detector: contains the MPI port for the connection of Nest to the translator during the simulation
- spike_generator: contains the MPI port for the connection of Nest to the translator during the simulation
- tvb: files generated by the TVB modules
- step_*.npy : the output of the monitor of TVB
- step_init.npy: the initialisation value of the node in TVB
- init_rates.npy: initialisation of the rate from Nest to TVB
- init_spikes.npy: initialisation of spikes from TVB to Nest
- parameters: parameters for the simulation
- long_simulation: same results but for a longer simulation
- demonstration_mouse_brain.ipynb: jupyter notebook for running an example of the application
- install
- deep
install.sh: installation on the DEEPEST cluster
Should be installed in /p/project/type1_1/[personaldir]
- docker
create_docker.sh, create_docker_2.sh: create the docker image for the project
Nest_TVB.dockerfile, Nest_TVB_2.dockerfile: file of configurations for docker
run_images: example of running co-simulation with the image
- py_venv
create_virtual_python.sh: create the virtual environment
- singularity
create_container.sh, create_container_2.sh: create the singularity image for the project
Nest_TVB_config.singularity, Nest_TVB_config_2.singularity: file of configurations for singularity
create_container_full.sh, Nest_TVB_config_full.singularity: file for the creation of the image for running a jupyter server with the co-simulation
run_images: example of running co-simulation with the image
- nest-io-dev: the branch of Nest with IO using MPI
- nest_elephant_nest: folder which contains the kernel of the simulation
- Nest: folder containing all the files to configure and launch Nest
- run_mpi_nest.sh: runs the Zerlaut simulation with MPI (used by the orchestrator to launch Nest)
- simulation_Zerlaut.py: runs the Nest part of the application
- orchestrator:
- parameters_manager.py: script which manages the parameters (saving, modifying the exploration parameters and the links between parameters)
- run_exploration.py: main script of the simulation for the exploration of 1 or 2 parameters with 1 or 2 simulators
- translation: folder containing the translators between TVB and Nest
- run_...: scripts for running the different components
- nest_to_tvb: for the communication from Nest to TVB
- tvb_to_nest: for the communication from TVB to Nest
- science_...: files containing the functions to transform spikes to rates and vice versa
- rate_spike: special functions used in the science files, based on elephant
- test_file: all the tests of the translators and of the Nest I/O
- TVB
- modify TVB: folder containing the files for the interface and the extensions of TVB
- Zerlaut.py: model of the Mean Field
- noise.py: specific noise for this model
- Interface_co_simulation.py: interface that can only be used sequentially but allows using intermediate results for the firing rate (Nest needs to be computed before TVB)
- Interface_co_simulation_parallel.py: can be used to compute TVB and Nest at the same time
- test_interface...: test for the interface with the model of Wong Wang
- simulation_Zerlaut.py: script for configuring and running the TVB simulator
- run_mpi_tvb.sh: runs the Zerlaut simulation with MPI (used by the orchestrator to launch TVB)
- test_nest: contains all the tests
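As a hedged sketch, the 'tvb/step_*.npy' monitor outputs listed above can be inspected with numpy. The directory and array shapes below are synthetic stand-ins; the real arrays depend on the TVB monitor configuration:

```python
import glob
import os
import tempfile
import numpy as np

# Create synthetic stand-ins for the 'tvb/step_*.npy' files described
# above (one saved monitor output per synchronization step).
result_dir = tempfile.mkdtemp()
for i in range(3):
    np.save(os.path.join(result_dir, 'step_%d.npy' % i), np.zeros((10, 2)))

# Load the monitor outputs in step order and stack them over time.
steps = sorted(glob.glob(os.path.join(result_dir, 'step_*.npy')))
data = np.concatenate([np.load(f) for f in steps], axis=0)
```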