Closed GorkaAbad closed 1 year ago
Hi, I want to create a (spiking) autoencoder. However, I checked, and there are no deconvolutional layers in spikingjelly/activation_based/layer.py. Can we create an autoencoder with LIF neurons from scratch? Is it preferred to use ANN2SNN? I assume the latter won't work because it requires spiking deconv layers. Do you have any suggestions? Thanks in advance, Gorka

Hi, I will add the deconv layers.

Is it preferred to use ANN2SNN?

I think you can use it as long as your ANN uses ReLU activations. We have added ConvTranspose layers:

Thanks for the fast reply. I tried it and it works perfectly. Thanks again.