Open kimborgen opened 1 year ago
Ah, torch.dropout scales the spikes! https://pytorch.org/docs/stable/generated/torch.nn.Dropout.html
Because the scaling is required for ANNs but does not make sense for binary spikes, let's remove the dropout layers for now and research spike-specific dropout methods (a possible direction is sketched below).
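A minimal sketch of one spike-specific option, assuming we just drop the 1/(1 - p) rescaling so outputs stay binary; `SpikeDropout` is a hypothetical module for illustration, not an existing PyTorch/snnTorch layer:

```python
import torch
import torch.nn as nn

class SpikeDropout(nn.Module):
    """Hypothetical dropout for spikes: zeroes elements with probability p
    but skips the 1/(1 - p) rescaling, so spikes remain 0 or 1."""

    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity at eval time or when p == 0, like nn.Dropout.
        if not self.training or self.p == 0.0:
            return x
        # Bernoulli keep-mask with keep probability (1 - p); no rescaling.
        mask = torch.bernoulli(torch.full_like(x, 1.0 - self.p))
        return x * mask
```

Whether skipping the rescale is actually the right thing for SNN training is exactly the open question, so this is only a starting point for the research mentioned above.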
The dropout layers cause the spikes (1) to be scaled to 1.42... What? When I remove the dropout call it's fine: `spk_conv1 = self.dropout1(spk_conv1)`
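For reference, a minimal repro of the scaling, assuming p = 0.3 (so 1 / (1 - 0.3) ≈ 1.4286, which matches the 1.42... above; the actual p in the model may differ):

```python
import torch
import torch.nn as nn

# nn.Dropout rescales surviving elements by 1/(1 - p) in training mode,
# so binary spikes (0/1) no longer stay binary.
dropout = nn.Dropout(p=0.3)
dropout.train()

spikes = torch.ones(10)   # all neurons firing
print(dropout(spikes))    # surviving entries are ~1.4286, dropped ones are 0

dropout.eval()            # in eval mode dropout is the identity
print(dropout(spikes))    # spikes stay at 1
```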