Open jajcayn opened 3 years ago
Yessss I was waiting for this issue ❤️ Some comments on your comments:
> - code for computing the cascade is already part of neurolib in `models/aln/aln-precalc/precompute_quantitites/` so I'd refactor a bit and move it somewhere else
> - the cascade computation should work as a single function with arguments as per single neuron parameters, i.e. user enters necessary neuronal parameters into the function and the function does everything and saves the cascade
It would be great to have a nicer wrapper around the precompute functions, I agree. It should ideally stay part of the same package I think, since it's not really part of the core of neurolib (rather "just" one of the models, namely `aln`).
> - I'd create something like `BaseMeanFieldNeuralMass` on par with the base `NeuralMass` in `MultiModel`.. it'd contain the dynamics based on number of populations, etc
I don't get this, can you explain? What does the computation have to do with `MultiModel`? In any case, we should first work on the native `ALNModel` if we extend it, and then port it to the `MultiModel` implementations of it. Right now `ALNModel` already accepts custom transfer functions, so idk exactly what this is referring to.
> - some functions will be necessary to compute the mean-field parameters (the `K`s, `J`s, `c`s, etc)... these would live under the same subpackage as the code for cascade computation and would save params as jsons next to the cascade - jsons, because default model parameters in `MultiModel` are dicts and json is practically a dict...
I'm a bit skeptical on this point. `K`'s and `J`'s do not affect the computation of the quantities, that's why they don't need to be remembered. I would discourage saving additional files (parameters) in another format (`json`) if unnecessary. The `quantities.hdf` table should contain all information.
> - finally, a final model would be created by subclassing the `BaseMeanFieldNeuralMass` into a new mean-field model.. user would define the name of the model, some basic attributes, like default noise input, etc. but the dynamics would be the same as ALN, the parameters would be loaded from json based on the computation and the linear-nonlinear cascade would be loaded based on the computation
How is this different from the existing ALN Model with different quantities? A `BaseMeanFieldNeuralMass` sounds very general, there are many different kinds of mean-field models out there. As you know, the one that we are using is an LN cascade model, that's why every model with this cascade will be just a version of the ALN Model.
My thoughts:
> I don't get this, can you explain? What does the computation have to do with `MultiModel`? In any case, we should first work on the native `ALNModel` if we extend it, and then port it to the `MultiModel` implementations of it. Right now `ALNModel` already accepts custom transfer functions, so idk exactly what this is referring to.
My initial thought was to make this under the `MultiModel` framework. The idea was: in the native ALN, everything is hardcoded. Yes, it can accept a different lin-nonlin cascade and different parameters, but refactoring it such that you can also set the number of populations within one node (e.g. the hippocampal MF model has 3) would be hard... Within the `MultiModel` framework I already have ideas on how to do it. Anyway, it'd be nice to have this also in the native ALN model, I agree. I will think more about how to do it. (The problem is: the integration is hardcoded and jitted with `numba` with 2 populations... At this point I do not see an easy workaround to allow `N` populations within one node.)
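To illustrate the refactoring being discussed, here is a minimal pure-Python sketch (all names and the toy dynamics are mine, not neurolib's actual API) of an integration loop parameterized over the number of populations, in contrast to a loop written out and jitted for exactly 2 populations:

```python
import math


def sigmoid_transfer(x):
    """Toy stand-in for a precomputed transfer function."""
    return 1.0 / (1.0 + math.exp(-x))


def integrate_n_populations(n_pop, coupling, n_steps, dt, transfer=sigmoid_transfer):
    """Euler-integrate n_pop coupled rate populations.

    Hypothetical sketch: in the native ALN the analogous (numba-jitted)
    loop is hand-written for exactly 2 populations (E and I); here n_pop
    is a free parameter, at the cost of losing the hand-unrolled,
    jit-friendly structure.
    """
    rates = [0.0] * n_pop
    for _ in range(n_steps):
        # total input to each population from all populations in the node
        inputs = [sum(coupling[i][j] * rates[j] for j in range(n_pop))
                  for i in range(n_pop)]
        # relax each rate towards the value given by its transfer function
        rates = [r + dt * (transfer(inp) - r) for r, inp in zip(rates, inputs)]
    return rates
```

A 3-population node, as in the hippocampal mean-field model mentioned above, would then just pass `n_pop=3` and a 3x3 coupling matrix (and, in the real thing, per-population transfer functions).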
> I'm a bit skeptical on this point. `K`'s and `J`'s do not affect the computation of the quantities, that's why they don't need to be remembered. I would discourage saving additional files (parameters) in another format (`json`) if unnecessary. The `quantities.hdf` table should contain all information.
well yes: the table contains the transfer functions for `r`, `V` and `tau`.. However, other model parameters such as `K`, `J` etc. also depend on spiking network parameters. So yes - the user can set this up using `model.params`, but they don't know how :) The functions would simply compute parameters such as `J`, `K` etc. from the parameters of the spiking network model. Saving to json is probably overkill, but the idea is: the user enters all necessary parameters of their spiking network model and the pipeline figures out not only the cascade, but also the other mean-field parameters like `c`, `J` and `K`. Moreover, e.g. `c` and `J` are more-or-less trivial when your spiking network model has current-based synapses. But in the case of conductance-based synapses, you actually need to compute `c` as the maximum PSC.
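For the current-based case, the "more-or-less trivial" derivation could look roughly like this (a hypothetical sketch; the function name and the reading of `K` as the mean in-degree are my assumptions, not neurolib code):

```python
def mean_field_connectivity(n_presyn, p_conn, syn_weight):
    """Derive mean-field connectivity parameters from spiking-network ones.

    Hypothetical sketch: K is taken as the mean in-degree of the network
    (number of presynaptic neurons times connection probability), and for
    current-based synapses J carries over from the synaptic weight as-is.
    """
    K = n_presyn * p_conn  # mean number of inputs per neuron
    J = syn_weight         # current-based weight needs no conversion
    return {"K": K, "J": J}
```

The point of wrapping even a trivial computation like this is that the user never has to know which entries of `model.params` these numbers map to.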
> How is this different from the existing ALN Model with different quantities? A `BaseMeanFieldNeuralMass` sounds very general, there are many different kinds of mean-field models out there. As you know, the one that we are using is an LN cascade model, that's why every model with this cascade will be just a version of the ALN Model.
yes, but see my comments up: problems are: number of populations within one ALN node; functions for computing ALN parameters such as `c`, `J`, `K` etc...
> - We should avoid adding too much complexity at once.
yes! agreed! I'd definitely divide this work into more PRs
> - Right now it seems like the ALNModel can do all this (and should be actually the model that does it all, not only with one set of parameters but with any set).

yes and no. Yes in the sense that you can just compute different cascades and work out the parameters of ALN. But it is necessary to include an option for more than 2 populations within one node inside the ALN's `timeIntegration.py`.
> - I think focusing on extending the ALNModel can be more fruitful. What seems to be especially missing is having multiple populations with different transfer functions. Is this what you are referring to mostly?
yes, two main points: extending the existing ALN to include more populations, based on the user's spiking network model, plus having functions to derive ALN parameters (not necessarily saving to json) from the spiking network parameters at hand
> The idea was: in the native ALN, everything is hardcoded.
So you want to build a modular `ALNModel`, is that the goal? With a varying number of populations? I thought that's exactly what `MultiModel` is for, I think I don't get it. You could simply create multiple `ALNMass` models and connect them the usual `MultiModel` way, but with support for multiple transfer functions.
> yes, but see my comments up: problems are: number of populations within one ALN node; functions for computing ALN parameters such as `c`, `J`, `K` etc...
The parameters `c`, `J`, and `K` are the same as in a spiking network, they don't need to be computed from anything. And they do not affect the transfer functions. The transfer function is computed from a single neuron. You're referring to "deriving a mean-field model for a spiking network". I think it's more precise to say that the `ALNModel` is an approximation of the AdEx population with synapses. We can't approximate any spiking network with the `ALNModel`.
I don't think it makes a lot of sense to automate every step. The use case is just too special. If someone wants to make a new model, they should exactly know what they are doing and build the model accordingly. If we just allow any parameter configuration to be translated into an ALNModel, we would have to do a lot of testing. We don't even know in which parameter regimes the ALNModel is valid or not, it's all been tested only in one or two regimes.
So all in all, I would suggest the following, just to save energy and time. Obviously this is just a suggestion:
The rest should be the researcher's job. Including verifying whether the mean-field approximation is valid/working at all, which would take an immense amount of time for each constructed model and probably result in a research paper.
I agree with all points.
My initial thoughts were in the direction of:
- you can set `delta_T` (the exponential parameter) to zero in the computation of the cascade (to cover LIF-style neurons). I cannot think of any reason why that would not work.
- for conductance-based synapses, the coupling can be worked out as `<number of presyn. neurons> * <probability of connection> * <unit increase of conductance> * (E_rev - V) / <capacitance>`... in other words, you can work out parameters even when your original spiking network is not current-based but rather conductance-based.

But you are probably right that you cannot approximate every spiking network with the ALN model just using different parameters and different cascades and optionally not using adaptation.
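The conductance-based expression quoted above, written out as a function (the parameter names are mine; the formula is exactly the one in the text):

```python
def coupling_c_conductance(n_presyn, p_conn, delta_g, e_rev, v_mean, capacitance):
    """Effective coupling c for conductance-based synapses:

        <number of presyn. neurons> * <probability of connection>
        * <unit increase of conductance> * (E_rev - V) / <capacitance>

    Unlike the current-based case, this depends on the (mean) membrane
    voltage V through the driving force (E_rev - V), which is why it is
    not simply a parameter copied over from the spiking network.
    """
    return n_presyn * p_conn * delta_g * (e_rev - v_mean) / capacitance
```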
I recently worked out the mean-field approximation of the spiking model of the hippocampus, more-or-less successfully. This process can be automated to a large extent.
The idea is to allow users to go from their spiking networks with multiple populations (LIF/EIF/AdEx neurons + conductance-/current-based synapses) to mean-field approximation in the light of ALN. The actual equations for the mean-field model are the same (slight adjustments w.r.t number of population), the only difference would be the precomputed linear-nonlinear cascade and the model parameters. The cascade computation depends on single neuron parameters in the spiking model and the parameters of the mean-field approximation depend on both the synaptic parameters and the network parameters (number of neurons, probability of connection, etc.).
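The split described above (cascade from single-neuron parameters, mean-field parameters from synaptic and network parameters) could, very roughly, give the pipeline this shape. Everything here is a hypothetical sketch of the proposed interface with stubbed-out internals, not existing neurolib code:

```python
def precompute_cascade(neuron_params):
    """Stub for the linear-nonlinear cascade precomputation; the real
    thing would derive the transfer functions for r, V and tau from the
    single-neuron parameters (as in models/aln/aln-precalc/)."""
    return {"transfer_r": None, "transfer_V": None, "transfer_tau": None,
            "neuron_params": dict(neuron_params)}


def derive_parameters(synapse_params, network_params):
    """Stub deriving mean-field parameters (here just K and J) from
    synaptic and network-level parameters."""
    K = network_params["n_neurons"] * network_params["p_conn"]
    return {"K": K, "J": synapse_params["weight"]}


def build_mean_field_model(neuron_params, synapse_params, network_params):
    """Proposed one-call pipeline: a spiking-network description goes in,
    an ALN-style mean-field description (cascade + parameters) comes out."""
    return {"cascade": precompute_cascade(neuron_params),
            "params": derive_parameters(synapse_params, network_params)}
```

The actual mean-field model would then be constructed from the returned cascade and parameter dict; how much of that construction should be automated is exactly what is debated above.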
Some things to consider:
- code for computing the cascade is already part of neurolib in `models/aln/aln-precalc/precompute_quantitites/` so I'd refactor a bit and move it somewhere else
- the cascade computation should work as a single function with arguments as per single neuron parameters, i.e. user enters necessary neuronal parameters into the function and the function does everything and saves the cascade
- I'd create something like `BaseMeanFieldNeuralMass` on par with the base `NeuralMass` in `MultiModel`.. it'd contain the dynamics based on number of populations, etc
- some functions will be necessary to compute the mean-field parameters (the `K`s, `J`s, `c`s, etc)... these would live under the same subpackage as the code for cascade computation and would save params as jsons next to the cascade - jsons, because default model parameters in `MultiModel` are dicts and json is practically a dict...
- finally, a final model would be created by subclassing the `BaseMeanFieldNeuralMass` into a new mean-field model.. user would define the name of the model, some basic attributes, like default noise input, etc. but the dynamics would be the same as ALN, the parameters would be loaded from json based on the computation and the linear-nonlinear cascade would be loaded based on the computation

Ideas? Critique?