This isn't a bug - just a scientific / implementation question.
In the Leaky and Synaptic classes (and probably other places), alpha and beta are constrained to the range [0, 1] by `torch.clamp(beta, 0, 1)`. This means that either parameter can drift outside that range through learning (or because the user sets a larger value without realizing it), but the out-of-range value won't actually be used.
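For instance (a toy illustration, not the actual snnTorch code), a learnable beta can end up above 1 while the clamp silently caps what the neuron actually uses:

```python
import torch

# Toy illustration of the behaviour described above: the stored parameter can
# drift outside [0, 1], but only the clamped value enters the dynamics.
beta = torch.nn.Parameter(torch.tensor(0.95))
mem = torch.tensor(1.0)

# Pretend an optimizer step pushed the raw parameter past 1
with torch.no_grad():
    beta += 0.2  # beta is now ~1.15

effective_beta = torch.clamp(beta, 0, 1)  # what the membrane update actually sees
mem_next = effective_beta * mem

print(beta.item())            # ~1.15 -> what you read out from the module
print(effective_beta.item())  # 1.0   -> what is actually applied
```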
Any reason not to use `torch.sigmoid()` as a wrapper on beta, so that the underlying parameter can take any value in $(-\infty, \infty)$ while the decay actually applied stays in (0, 1)?
This would make reading out `self.beta` directly less informative, but of course you could just write `torch.sigmoid(self.beta)` whenever you want the decay value that is actually in use.
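As a rough sketch of how the parameterization could look (the class and method names here are purely illustrative, not snnTorch's actual API):

```python
import torch
import torch.nn as nn

class SigmoidLeaky(nn.Module):
    """Toy leaky integrator whose decay is parameterized through a sigmoid."""

    def __init__(self, beta_init=0.9):
        super().__init__()
        # Store the unconstrained logit; learning can move it anywhere in (-inf, inf).
        beta_logit = torch.log(torch.tensor(beta_init) / (1.0 - beta_init))
        self.beta = nn.Parameter(beta_logit)

    def effective_beta(self):
        # Squash into (0, 1) whenever the decay is actually used or inspected.
        return torch.sigmoid(self.beta)

    def forward(self, input_, mem):
        # Membrane update with the sigmoid-constrained decay.
        return self.effective_beta() * mem + input_
```

The raw parameter is then free to take any real value during optimization, and the decay that actually drives the dynamics is always in (0, 1).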