In the PyTorch section, change this:

```python
func = {
    ...
    'leakyrelu' : F.leaky_relu
    ...
}.get(self.type.lower())
```

to this:

```python
func = {
    ...
    'leakyrelu' : (lambda x: F.leaky_relu(x, negative_slope=self.alpha))
    ...
}.get(self.type.lower())
```
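As a self-contained illustration of why the lambda matters (the `Activation` wrapper and `get_func` name here are hypothetical stand-ins, not kur's actual code):

```python
import torch
import torch.nn.functional as F

class Activation:
    """Hypothetical stand-in for kur's activation container; only the
    pieces needed to show the dispatch are included."""
    def __init__(self, type, alpha=0.01):
        self.type = type
        self.alpha = alpha

    def get_func(self):
        # Wrapping the call in a lambda defers it until the activation
        # is applied, so `negative_slope` picks up `self.alpha` instead
        # of being fixed at PyTorch's default.
        return {
            'relu': F.relu,
            'leakyrelu': (lambda x: F.leaky_relu(x, negative_slope=self.alpha)),
        }.get(self.type.lower())

act = Activation('leakyrelu', alpha=0.3)
print(act.get_func()(torch.tensor([-1.0, 2.0])))  # tensor([-0.3000, 2.0000])
```

The bare `F.leaky_relu` entry would always use the default slope of 0.01; the lambda is what threads the user-supplied value through.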
Side-note: I would also suggest checking that `alpha` is not defined when `self.type` is not `leakyrelu`.
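A sketch of that check (names assumed from the snippet above; kur's real validation and error conventions may differ):

```python
# Reject a stray `alpha` on activations that can't use it.
if self.alpha is not None and self.type.lower() != 'leakyrelu':
    raise ValueError(
        '"alpha" is only meaningful for the "leakyrelu" activation, '
        'but the activation type is: {}'.format(self.type)
    )
```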
I have managed to add `LeakyReLU` for both the Keras and PyTorch backends (see code below). Now I want to add an `alpha` argument in Keras (or `negative_slope` in PyTorch; equivalent, I guess) to this activation. I could add `alpha` to the Keras `LeakyReLU`, but failed to add it for PyTorch. When a user needs to set a value for `alpha` or `negative_slope`, how can they do it if we don't make the argument accessible? That's why I want to make the argument available in kur. Or is it that in most cases we don't need to change `alpha` or `negative_slope`, so there is no point in adding them?
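(They are indeed the same parameter under two names: both scale the negative part of the input. A quick PyTorch check, for illustration:)

```python
import torch
import torch.nn.functional as F

# LeakyReLU(x) = x for x >= 0, slope * x for x < 0.
# Keras names the slope `alpha`; PyTorch names it `negative_slope`.
x = torch.tensor([-2.0, 0.0, 3.0])
print(F.leaky_relu(x, negative_slope=0.3))  # tensor([-0.6000, 0.0000, 3.0000])
```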
Below is the source code I have changed to make `LeakyReLU` and `alpha` work, but not `negative_slope` for PyTorch. Could you check it for me and shed some light on how to add `negative_slope` for PyTorch? Thanks