simetenn / uncertainpy

Uncertainpy: a Python toolbox for uncertainty quantification and sensitivity analysis, tailored towards computational neuroscience.
http://uncertainpy.readthedocs.io
GNU General Public License v3.0

Memory Error #17

Closed maciej-jedynak closed 6 years ago

maciej-jedynak commented 6 years ago

At the stage of "Calculating statistics from PCE:" I get a MemoryError, even though I specifically extended the swap by 8 GB, so I suppose some very heavy operations are being performed. The network of spiking neurons and the simulation time were relatively small (100 neurons, 1000 time steps).

File "ntwk_sim_demo_un_NetworkFeatures.py", line 314, in data = UQ.quantify(save=False) File "/home/maciek/anaconda3/envs/HH_MF/src/uncertainpy-master/src/uncertainpy/uncertainty.py", line 372, in quantify custom_kwargs) File "/home/maciek/anaconda3/envs/HH_MF/src/uncertainpy-master/src/uncertainpy/uncertainty.py", line 636, in polynomial_chaos custom_kwargs File "/home/maciek/anaconda3/envs/HH_MF/src/uncertainpy-master/src/uncertainpy/core/uncertainty_calculations.py", line 1248, in polynomial_chaos data = self.analyse_PCE(U_hat, distribution, data, nr_samples=nr_pc_mc_samples) File "/home/maciek/anaconda3/envs/HH_MF/src/uncertainpy-master/src/uncertainpy/core/uncertainty_calculations.py", line 924, in analyse_PCE U_mc[feature] = U_hatfeature File "/home/maciek/anaconda3/envs/HH_MF/lib/python2.7/site-packages/chaospy/poly/base.py", line 160, in call return chaospy.poly.caller.call(self, args) File "/home/maciek/anaconda3/envs/HH_MF/lib/python2.7/site-packages/chaospy/poly/caller.py", line 69, in call val = np.where(val != val, 0, val) MemoryError

simetenn commented 6 years ago

Yes, there are unfortunately some quite heavy calculations performed in the background at this step. Are you recording from all 100 neurons (i.e. getting a spiketrain from each)? For the example Brunel network in the documentation I record from 20 neurons over 1000 ms, and this requires somewhere around ~45 GB of memory.
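To make the cost concrete, here is a toy sketch of the evaluation pattern that `analyse_PCE` follows, written against the chaospy API of that era (`orth_ttr` / `fit_regression`); the sizes are illustrative stand-ins, not the issue's actual model:

```python
import numpy as np
import chaospy as cp

# Two uncertain parameters, a low-order polynomial chaos basis.
dist = cp.J(cp.Uniform(0, 1), cp.Uniform(0, 1))
basis = cp.orth_ttr(3, dist)

# Toy model output: 30 collocation runs, each producing 50 recorded values
# (in the issue this would be neurons * time steps, i.e. ~100 000 values).
nodes = dist.sample(30)                          # shape (2, 30)
evals = np.random.rand(30, 50)
U_hat = cp.fit_regression(basis, nodes, evals)   # fitted expansion

# The expensive step: the expansion is called on all Monte Carlo samples
# at once, so memory scales as (nr of MC samples) * (nr of recorded values).
mc = dist.sample(10**3)
U_mc = U_hat(*mc)                                # 10**3 x 50 entries here;
                                                 # 10**4 x 100 000 in the issue
```

This is why recording from fewer neurons (or fewer time points) reduces the memory footprint roughly proportionally.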

maciej-jedynak commented 6 years ago

I managed to record from 200 neurons out of a network of 1000 Hodgkin-Huxley neurons, simulated for 1000 ms. In order to do that I extended the swap to 40 GB.
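For anyone else landing here, another lever is to shrink the sample count that the traceback shows being forwarded as `nr_samples=nr_pc_mc_samples`. A minimal sketch; `model` and `parameters` are hypothetical placeholders for the setup in the user's script, which the issue does not show:

```python
import uncertainpy as un

# model and parameters stand in for the user's network model and parameter
# distributions; only the nr_pc_mc_samples keyword is taken from the traceback.
UQ = un.UncertaintyQuantification(model=model,
                                  parameters=parameters,
                                  features=un.NetworkFeatures())

# Fewer Monte Carlo samples in analyse_PCE means proportionally smaller
# arrays, at the cost of less accurate statistics estimated from the PCE.
data = UQ.quantify(save=False, nr_pc_mc_samples=10**3)
```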

simetenn commented 6 years ago

Extending the swap is a nice workaround to know of.