NeuromorphicProcessorProject / snn_toolbox

Toolbox for converting analog to spiking neural networks (ANN to SNN), and running them in a spiking neuron simulator.
MIT License

TTFS dyn thresh and TTFS corrective not working #140

Closed: githubofaliyev closed this issue 9 months ago

githubofaliyev commented 11 months ago

When I run with any of the other spike codings, everything works fine with no issues. But these two spike codings throw the error below. I tried to trace the error back to the backend source files but still couldn't work it out. I am using the dev version. Thanks


Config file:

[paths]
path_wd = /Users/ilkinaliyev/Desktop/snntoolbox/snn_toolbox-master/temp/mnist_cnn
dataset_path = /Users/ilkinaliyev/Desktop/snntoolbox/snn_toolbox-master/temp/mnist_cnn
filename_ann = mnist_cnn

[conversion]
spike_code = ttfs_corrective

[tools]
evaluate_ann = True
normalize = True

[simulation]
simulator = INI
num_to_test = 10000
batch_size = 5000
keras_backend = tensorflow
duration = 100

[output]
verbose = 0
plot_vars = {'spikerates', 'v_mem', 'spiketrains', 'correlation', 'input_image', 'operations', 'activations', 'normalization_activations', 'error_t', 'spikecounts'}
overwrite = True
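For context, this is roughly how I invoke the toolbox with the config above (a minimal sketch following the pattern of the examples/ scripts and the snntoolbox.bin.run entry point visible in the traceback below; the exact config file path is specific to my setup):

```python
# Minimal driver: parse the INI config above and run the full
# ANN-to-SNN conversion and simulation pipeline.
from snntoolbox.bin.run import main

# Path to the config file shown above (adjust to your setup).
config_filepath = '/Users/ilkinaliyev/Desktop/snntoolbox/snn_toolbox-master/temp/mnist_cnn/config'
main(config_filepath)
```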

*** ERROR output: ***

Number of operations of ANN: 79978762
Number of neurons: 108106
Number of synapses: 48185088

Saving model to /Users/ilkinaliyev/Desktop/snntoolbox/snn_toolbox-master/temp/mnist_cnn/mnist_cnn_INI.h5...

Traceback (most recent call last):
  File "/Users/ilkinaliyev/Desktop/snntoolbox/snn_toolbox-master/examples/resnet_keras_cifar_INI.py", line 176, in <module>
    main(config_filepath)
  File "/Users/ilkinaliyev/miniconda3/lib/python3.11/site-packages/snntoolbox/bin/run.py", line 31, in main
    run_pipeline(config)
  File "/Users/ilkinaliyev/miniconda3/lib/python3.11/site-packages/snntoolbox/bin/utils.py", line 131, in run_pipeline
    spiking_model.save(config.get('paths', 'path_wd'),
  File "/Users/ilkinaliyev/miniconda3/lib/python3.11/site-packages/snntoolbox/simulation/target_simulators/INI_temporal_mean_rate_target_sim.py", line 255, in save
    self.snn.save(filepath, self.config.getboolean('output', 'overwrite'))
  File "/Users/ilkinaliyev/miniconda3/lib/python3.11/site-packages/keras/src/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/Users/ilkinaliyev/miniconda3/lib/python3.11/site-packages/h5py/_hl/group.py", line 183, in create_dataset
    dsid = dataset.make_new_dset(group, shape, dtype, data, name, **kwds)
  File "/Users/ilkinaliyev/miniconda3/lib/python3.11/site-packages/h5py/_hl/dataset.py", line 163, in make_new_dset
    dset_id = h5d.create(parent.id, name, tid, sid, dcpl=dcpl, dapl=dapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5d.pyx", line 138, in h5py.h5d.create

githubofaliyev commented 11 months ago

@rbodo have you faced this issue before? I don't know what else I should try to find the root cause of this.

rbodo commented 11 months ago

Hi, I haven't seen this error before, and to be honest I don't really see why it fails: the error message from the h5py library does not seem to offer any explanation. Was there nothing else in the output?
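One thing you could try is rerunning with Keras's traceback filtering disabled, so the underlying h5py error message that the error_handler wrapper (visible in your trace) hides gets surfaced. A sketch, assuming a recent TensorFlow 2.x where tf.debugging.disable_traceback_filtering is available:

```python
# Show unfiltered stack traces so the raw h5py error surfaces.
import tensorflow as tf
tf.debugging.disable_traceback_filtering()

# Then rerun the failing pipeline as before.
from snntoolbox.bin.run import main
main(config_filepath)  # same config path as in your script
```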

You said you are using the dev version, but curiously the line numbers don't seem to match: The stack trace points to line 255 for the save method, but in the repo it's on line 253. Might be nothing, but can you confirm there has been no relevant modification to the source code?

As to the actual problem: since we are calling the standard keras.save method here, I would guess that the keras version you have installed is incompatible with the custom SNN model properties we are trying to save. If that is indeed the reason, ideally one would figure out which properties are causing the issue and bring them up to date with keras. Alternatively, you could try downgrading keras (see here for an old version list; perhaps you could start from there).
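To narrow it down, a quick sanity check (a minimal sketch, independent of the toolbox; the model and file name are arbitrary) would be to confirm whether plain Keras HDF5 saving works at all with your installed versions:

```python
# Print the relevant versions and try a bare-bones HDF5 save,
# exercising the same keras/h5py saving path that fails above.
import h5py
import keras
import tensorflow as tf

print('keras:', keras.__version__)
print('tensorflow:', tf.__version__)
print('h5py:', h5py.__version__)

# Tiny throwaway model; saving to .h5 uses the legacy HDF5 format.
model = keras.Sequential([keras.layers.Input(shape=(8,)), keras.layers.Dense(4)])
model.save('sanity_check.h5')
print('Plain HDF5 save succeeded.')
```

If this already fails, the problem is the keras/h5py combination rather than anything SNN-specific. If it succeeds, the incompatibility is likely in the custom SNN layer attributes, and downgrading keras (e.g. pip install keras==<older version>) would be the next thing to try.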