AllenInstitute / bmtk

Brain Modeling Toolkit
https://alleninstitute.github.io/bmtk/
BSD 3-Clause "New" or "Revised" License

Record GLIF model after spike current PointNet #365

Open Jballbe opened 2 months ago

Jballbe commented 2 months ago

Hi,

I am currently using a GLIF3 model (glif_psc, the current-based generalized leaky integrate-and-fire model from the Allen Institute), and I was wondering if there is a way to record the after-spike currents during the simulation?

Thank you for any help you can give me! Best,

Julien Ballbé

kaeldai commented 2 months ago

Hi @Jballbe

I was able to look into it, and it looks like the glif_psc models have a variable called "ASCurrents_sum" that can be recorded at each step of the simulation. It appears to be the sum of both "asc_amps" in pA. (The individual "asc_amps" are stored in a vector of size 2, and I don't think NEST allows multimeter recordings of vectors.)
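If it helps to see what that variable looks like outside of BMTK, here is a minimal NEST-only sketch of recording "ASCurrents_sum" from a single glif_psc cell with a multimeter. The stimulus amplitude, timing, and GLIF3-style flags below are illustrative placeholders, not values taken from your model:

```python
import nest

nest.ResetKernel()

# Illustrative GLIF3-style configuration: LIF + after-spike currents only.
neuron = nest.Create("glif_psc", params={
    "spike_dependent_threshold": False,
    "after_spike_currents": True,
    "adapting_threshold": False,
})

# Drive the cell with a step current so it spikes (placeholder values).
stim = nest.Create("step_current_generator", params={
    "amplitude_times": [100.0],
    "amplitude_values": [300.0],
})
nest.Connect(stim, neuron)

# Record ASCurrents_sum (and V_m) every 0.1 ms.
mm = nest.Create("multimeter", params={
    "record_from": ["ASCurrents_sum", "V_m"],
    "interval": 0.1,
})
nest.Connect(mm, neuron)

nest.Simulate(1000.0)

events = mm.get("events")
print(events["times"][:5], events["ASCurrents_sum"][:5])
```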

To record the "ASCurrents_sum" variable in BMTK you can add a "multimeter_report" to the reports section of the config.simualation.json file. For instance to record in the point_450glifs example I'd add the following to record from the Scnn1a cells:

"reports": {
    "ascurrents_recordings": {
      "module": "multimeter_report"
      "variable_name": "ASCurrents_sum",
      "cells": {
        "population": "v1",
        "model_template": "nest:glif_psc",
        "model_name": "Scnn1a"
      },
    }
  },

You can modify the cell-attribute filters to record from a different population (or even all glif_psc cells). After the simulation runs it will create a file output/ascurrents_recordings.h5 with a Timestamps x Cells "data" table.
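To inspect that file afterwards you can open it directly with h5py. The group layout sketched below (a SONATA-style report/<population> group with a "data" table and node-id mapping) is an assumption, so list the file's keys first if your output is organized differently:

```python
import h5py

with h5py.File("output/ascurrents_recordings.h5", "r") as f:
    f.visit(print)                         # list groups/datasets to confirm layout
    report = f["report/v1"]                # assumed population name "v1"
    data = report["data"][()]              # assumed Timestamps x Cells array
    node_ids = report["mapping/node_ids"][()]  # assumed node-id mapping
    print("data shape (timesteps, cells):", data.shape)
    print("first recorded node ids:", node_ids[:10])
```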

Just a warning: the multimeter recording files can get very large and can also slow down the simulation significantly. Especially if you're not running on an HPC, it may be good to either filter the cells or reduce the tstop or dt parameters in the config until you know how big these files will get!
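As a rough guide, you can estimate the data volume before running. The sketch below assumes one 64-bit float per recorded cell per timestep; the tstop, dt, and cell-count values are only examples:

```python
# Back-of-the-envelope size estimate for a multimeter report (illustrative values).
tstop_ms = 3000.0   # simulation length from the "run" section of the config
dt_ms = 0.1         # recording/simulation step
n_cells = 450       # number of cells matched by the report's "cells" filter

n_steps = int(tstop_ms / dt_ms)
size_bytes = n_steps * n_cells * 8          # assuming float64 values
print(f"~{size_bytes / 1e6:.1f} MB of data")  # ~108 MB for these numbers
```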

Jballbe commented 2 months ago

Hi @kaeldai ,

Thank you very much for your response, that's very helpful. And thank you for the warning.

Best, Julien