A deep learning library for spiking neural networks, based on PyTorch, that focuses on fast training and supports inference on neuromorphic hardware.
I cannot find a case where this would fail. The unit tests have been updated accordingly.
Regarding the root cause of the issue: the SpikeLayer constructors do not share the same signature. I believe this was intentional, and passing the non-shared parameters in a dictionary covers all the cases, as long as users do not misuse it. I do not have a suggestion for addressing this without extensive discussion.
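To illustrate the dictionary workaround described above, here is a minimal sketch. The class and parameter names (`IAFLayer`, `LIFLayer`, `tau_mem`) are hypothetical placeholders, not the library's actual API; the point is only the pattern of forwarding non-shared constructor parameters via a dict:

```python
# Hypothetical layer classes with non-identical constructor signatures.
# These names are illustrative only, not the library's real classes.
class IAFLayer:
    def __init__(self, threshold=1.0):
        self.threshold = threshold

class LIFLayer:
    def __init__(self, threshold=1.0, tau_mem=20.0):
        self.threshold = threshold
        self.tau_mem = tau_mem

def build_layer(layer_cls, threshold, extra_params=None):
    """Construct a layer, forwarding non-shared parameters from a dict."""
    extra_params = extra_params or {}
    return layer_cls(threshold=threshold, **extra_params)

# Shared parameters are passed directly; layer-specific ones go in the dict.
iaf = build_layer(IAFLayer, threshold=1.0)
lif = build_layer(LIFLayer, threshold=1.0, extra_params={"tau_mem": 10.0})
```

This keeps a single construction path for all layer types, at the cost of the dict's contents not being checked until the constructor is called.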
This does not concern quality control.
Changes are automatically reflected in the documentation.