jeshraghian / snntorch

Deep and online learning with spiking neural networks in Python
https://snntorch.readthedocs.io/en/latest/
MIT License

Question about coding choice: why are beta and alpha clamped? #346

Open landoskape opened 1 month ago

landoskape commented 1 month ago

This isn't a bug - just a scientific / implementation question.

In the Leaky and Synaptic classes (and probably other places), alpha and beta are constrained to lie between 0 and 1 by torch.clamp(beta, 0, 1). This means either parameter can drift outside that range during learning (or a user can set a larger value without realizing it), but the out-of-range value won't actually be used.
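
For illustration, here is a minimal sketch of that pattern (not snntorch's actual implementation, just the shape of the issue): the raw parameter is stored unconstrained, and only a clamped copy enters the membrane update, so a learned value of e.g. 1.3 silently behaves like 1.0 and stops receiving gradient.

import torch
import torch.nn as nn

class ClampedLeaky(nn.Module):
    """Toy leaky integrator: beta is clamped on use, not in storage."""
    def __init__(self, beta):
        super().__init__()
        # learnable raw decay; nothing stops it from leaving [0, 1] during training
        self.beta = nn.Parameter(torch.as_tensor(beta, dtype=torch.float))

    def forward(self, input_, mem):
        # only the clamped copy enters the update, so the gradient w.r.t. beta
        # is zero whenever the raw value sits outside [0, 1]
        beta = torch.clamp(self.beta, 0, 1)
        return beta * mem + input_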

Any reason not to wrap beta in torch.sigmoid() instead, so the raw parameter can range over $(-\infty, \infty)$ while the effective value always stays in $(0, 1)$?

This would make reading out self.beta less informative, but of course you could just write:

def get_beta(self):
    return torch.sigmoid(self.beta)
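
For concreteness, a sketch of the alternative I have in mind (SigmoidLeaky is just a hypothetical name, not an existing snntorch class): store the logit and squash it on use, so the stored parameter is unconstrained while the effective decay always lands in (0, 1).

import torch
import torch.nn as nn

class SigmoidLeaky(nn.Module):
    """Toy variant: unconstrained raw parameter, sigmoid-squashed on use."""
    def __init__(self, beta=0.9):
        super().__init__()
        beta = torch.as_tensor(beta, dtype=torch.float)
        # store the logit so that sigmoid(raw) recovers the requested beta
        self.beta = nn.Parameter(torch.logit(beta))

    def get_beta(self):
        # effective decay, always in (0, 1)
        return torch.sigmoid(self.beta)

    def forward(self, input_, mem):
        return self.get_beta() * mem + input_

The forward update above is only a stand-in for the real Leaky/Synaptic dynamics; the point is just where the nonlinearity sits relative to the stored parameter.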