Closed shreshth-29 closed 11 months ago
I might be missing something, but I don't understand how you define weights in this context. What does 'filter' represent? And 'membrane', 'spiketrains', and 'spiketimes' should not be considered (learnable or fixed) parameters; they are state variables that are updated at runtime depending on the dynamic network input.
So I am working with the SNN Toolbox, and I managed to convert a CNN into an SNN and save it as a .h5 file. Because the converted model now has layers such as 'SpikeConv2D' that are not recognized by Keras out of the box, I registered them as custom objects, which worked perfectly. I then used the load_model function to load this model, and it worked. The SpikeConv layer has 6 kinds of weights: filter, bias, dt, threshold, membrane and spiketrains. I was able to analyze everything perfectly. The encoding was 'temporal_mean_rate'.
Now I changed the encoding to 'ttfs_base'. This new model is supposed to have 8 weights in the SpikeConv layer ('spiketimes' and something else in addition to the 6 earlier). When I now try to load the model, I get a ValueError: "weight count mismatch, expected 6 but got 8 weights". The Keras source code raises this error whenever the number of symbolic weights (6 in this case) does not equal the number of weight arrays received (8). I have not explicitly set the number of symbolic weights to 6 anywhere while registering this custom object. Is there any way to fix this?
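For context, the check that raises this error is essentially a length comparison between the symbolic weight variables the layer creates in its build() and the weight arrays stored in the .h5 file. Here is a minimal plain-Python sketch of that logic (not the actual Keras source; the function name and the eighth variable name are illustrative placeholders, since the issue doesn't state what the extra weight is):

```python
def load_layer_weights(symbolic_weights, saved_weight_values):
    """Sketch of the pairing step Keras performs when restoring a layer
    from an HDF5 file: every saved array must match up one-to-one with
    a symbolic weight variable created by the layer's build()."""
    if len(symbolic_weights) != len(saved_weight_values):
        raise ValueError(
            f"weight count mismatch, expected {len(symbolic_weights)} "
            f"but got {len(saved_weight_values)} weights"
        )
    return list(zip(symbolic_weights, saved_weight_values))

# The layer class registered at load time builds 6 variables...
mean_rate_vars = ["filter", "bias", "dt", "threshold", "membrane", "spiketrains"]
# ...but the file saved under 'ttfs_base' contains 8 arrays
# ("extra_var" is a placeholder for the unnamed eighth weight).
ttfs_arrays = mean_rate_vars + ["spiketimes", "extra_var"]

try:
    load_layer_weights(mean_rate_vars, ttfs_arrays)
except ValueError as e:
    print(e)  # weight count mismatch, expected 6 but got 8 weights
```

If this reading is right, the mismatch suggests the SpikeConv2D class registered under custom_objects still comes from the temporal_mean_rate backend, so its build() only creates 6 variables; registering the corresponding layer classes from the ttfs_base backend (so that 8 variables are built) would be the first thing to check.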