sbi-dev / sbi

Simulation-based inference toolkit
https://sbi-dev.github.io/sbi/
Apache License 2.0

Unable to fit neuron model with sbi #405

Closed rat-h closed 3 years ago

rat-h commented 3 years ago

Single-compartment model with eight cross-membrane currents and calcium dynamics; it is a model of adult LGN thalamocortical neurons. I'm trying to fit this model to recordings from juvenile animals. Each recording has from 20 to 48 sweeps with different applied currents, and the model has to reproduce the somatic voltage for all of these currents. Because some cross-membrane currents change conductance density during maturation, I had to widen the ranges of many parameters quite a lot. Overall, sbi should fit 24 parameters.

If combined, the data statistics for one recording form a vector with 242 elements: [mean and std of voltage at rest][number of spikes for each sweep][mean, std, skewness, and kurtosis of voltage in each stimulation]
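
For concreteness, such a summary vector could be assembled roughly as follows (a sketch with numpy/scipy; the function and variable names are hypothetical, and 48 sweeps is the count that yields 242 elements: 2 + 48 + 4·48):

```python
import numpy as np
from scipy.stats import skew, kurtosis

def summary_stats(rest_v, sweeps):
    """Build a flat summary vector from one recording.

    rest_v: voltage trace at rest (1-D array)
    sweeps: list of (spike_count, stim_voltage_trace), one per applied current
    """
    feats = [np.mean(rest_v), np.std(rest_v)]      # [mean, std] at rest
    feats += [n for n, _ in sweeps]                # spike count per sweep
    for _, v in sweeps:                            # 4 moments per stimulation
        feats += [np.mean(v), np.std(v), skew(v), kurtosis(v)]
    return np.asarray(feats, dtype=float)

# With 48 sweeps: 2 + 48 + 4 * 48 = 242 elements, matching the issue.
rng = np.random.default_rng(0)
x = summary_stats(rng.normal(-65, 1, 1000),
                  [(3, rng.normal(-50, 5, 1000)) for _ in range(48)])
print(x.shape)  # (242,)
```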

Here is a tarball with the code. To run it, unpack the archive, install NEURON (`pip install neuron`), and run the fitting: `python sbiFit.py -i P07-04.20205021.npz`

It doesn't matter how many initial samples I draw from the prior distribution (50, 10000, or 50000, the last taking 9 hours on a 64-core computer); the result is the same: it computes all of the samples and then gets stuck with the following message:

Neural network successfully converged after 136 epochs.
Drawing 50 posterior samples:   WARNING:root:Only 0% posterior samples are within the
                        prior support. It may take a long time to collect the remaining
                        50 samples. Consider interrupting (Ctrl-C)
                        and switching to sample_with_mcmc=True.
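
For intuition (this is a toy illustration, not sbi's internals): the warning fires because sbi draws from the learned density estimator and rejects samples outside the prior box; if the estimator's mass drifted outside the box, essentially nothing is ever accepted:

```python
import numpy as np

rng = np.random.default_rng(0)
lo, hi = -1.0, 1.0   # toy prior: Uniform(-1, 1) in each of 24 dimensions
dim = 24

# Suppose the learned "posterior" drifted to mean 3, far outside the prior box.
samples = rng.normal(loc=3.0, scale=0.5, size=(100_000, dim))
inside = np.all((samples >= lo) & (samples <= hi), axis=1)
print(f"{inside.mean():.0%} of posterior samples are within the prior support")
```

Rejection sampling then loops forever trying to collect the requested 50 samples, which is exactly the stall described above.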

If I enable MCMC (`python sbiFit.py -i P07-04.20205021.npz -m`), it gets stuck with a different message:

Neural network successfully converged after 135 epochs.
Tuning bracket width...:   0%| 

Any hope of making it work? P.S. A GA with MO can handle this pretty well, but I would like to have a parameter generator, not a set of parameters. From this point of view, sbi should be very handy, but I have failed to complete even a single run.

janfb commented 3 years ago

Hi @rat-h, thanks for the detailed issue.

The message means that most of the mass of the posterior learned by sbi lies outside the prior bounds, which is a sign that learning the posterior did not work well. How did you define the prior over each of the 24 parameters?
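
For reference, conductance densities are often given priors that span orders of magnitude, so a log-uniform parameterization is common (in sbi itself one would typically use a `BoxUniform` over torch tensors, possibly in the log domain). A minimal numpy sketch with made-up bounds for three hypothetical parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bounds for a few of the 24 parameters (units omitted).
# Conductance densities are sampled log-uniformly.
log_lo = np.log10([1e-5, 1e-4, 1e-6])   # e.g. lower bounds of 3 densities
log_hi = np.log10([1e-1, 1e-2, 1e-3])   # e.g. upper bounds

def sample_prior(n):
    """Draw n parameter sets, log-uniform in each conductance density."""
    u = rng.uniform(log_lo, log_hi, size=(n, len(log_lo)))
    return 10.0 ** u

theta = sample_prior(5)
print(theta.shape)  # (5, 3)
```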

The 242 dimensions in x space are quite a lot, but should still be OK for SNPE.

As a next debugging step, you could reduce the dimensionality of the inference problem: fix some of the parameters to reasonable values, e.g., values you obtained with other methods like GA with MO (btw, what does that stand for?), and then do inference with sbi over, say, the 4 parameters that you do not fix.
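
One way to do that without touching the model code is to wrap the simulator so that sbi only sees the free parameters; everything else stays pinned to previously fitted values. A sketch with hypothetical parameter names and a dummy simulator standing in for the NEURON model:

```python
import numpy as np

# All 24 parameters, fixed to values from a previous fit (hypothetical).
FIXED = {f"p{i}": 0.5 for i in range(24)}
FREE = ["p0", "p1", "p2", "p3"]        # expose only 4 parameters to sbi

def full_simulator(params):
    """Stand-in for the neuron simulator: returns summary statistics."""
    v = np.fromiter(params.values(), dtype=float)
    return np.concatenate([v, v ** 2])  # dummy 48-dim "summary"

def reduced_simulator(theta):
    """What sbi would call: theta holds only the 4 free parameters."""
    params = dict(FIXED)
    params.update(zip(FREE, np.asarray(theta, dtype=float)))
    return full_simulator(params)

x = reduced_simulator([0.1, 0.2, 0.3, 0.4])
print(x.shape)  # (48,)
```

The prior then only needs to cover the 4 free dimensions, which makes both training and sampling far easier to debug.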

I hope that helps, Jan

rat-h commented 3 years ago

How did you define the prior over each of the 24 parameters?

I do not completely understand the question. Do you mean hyperparameters of the prior uniform/log-normal distributions? If so, there are no good, solid data for my neurons, and therefore some channels may have different domain combinations and so on. I have to open up the parameter space and see what falls out of the optimization procedure. That isn't an ideal approach, because it produces lots of unstable runs that end with NaNs. But this is how I can get some parameters for a model that reproduces the real neuron's behavior, and then I can assess which of those parameters are realistic.
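
(Aside: those NaN-producing runs will poison training if they reach the density estimator. One simple option is to drop invalid simulations before fitting; sbi also ships utilities for handling invalid `x`, if I recall correctly. A minimal numpy sketch:)

```python
import numpy as np

def drop_invalid(theta, x):
    """Keep only (theta, x) pairs whose summary vector is entirely finite."""
    ok = np.isfinite(x).all(axis=1)
    return theta[ok], x[ok]

theta = np.arange(8.0).reshape(4, 2)
x = np.array([[1.0, 2.0], [np.nan, 1.0], [3.0, np.inf], [0.0, 0.0]])
t2, x2 = drop_invalid(theta, x)
print(len(t2))  # 2 valid simulations remain
```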

sbi over, say, 4 parameters that you do not fix

Only for debugging, I hope. A real neuron optimization problem may have hundreds of parameters.

GA with MO (btw what's the meaning of that?)

Genetic algorithm with multiobjective optimization. I use a few of them: the very popular NSGA-II (non-dominated sorting with a Pareto archive), the also very popular GA with index selection, and a homemade GA with Krayzam's adaptive weights.

I'll try to run sbi with a few parameters to fit and let you know.

jan-matthis commented 3 years ago

I am closing this issue due to inactivity. @rat-h feel free to reopen it at any point