Closed EnricoTrizio closed 2 months ago
I think that works, but it'll probably end up creating more lines of code than you'd save over handling it explicitly as usual (`linear` here really means `None`). I don't think that function is called in many places anyway. What if we remove that case and just call that function as

```python
if activ != 'linear':
    get_activation(activ)
else:
    logger.warn('No activation function')
```
Sorry, I realized this because for the committor one has an activation on the last layer, but it's also useful to turn it off sometimes. I can move it to the post-processing part if needed.
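For reference, a minimal runnable sketch of that call-site pattern; the registry and `build_activation` wrapper are hypothetical stand-ins for illustration, not the library's actual helpers:

```python
import logging
import math

logger = logging.getLogger(__name__)

# Hypothetical registry standing in for the library's real get_activation helper.
_ACTIVATIONS = {
    "relu": lambda x: max(x, 0.0),
    "tanh": math.tanh,
}

def get_activation(activ):
    return _ACTIVATIONS[activ]

def build_activation(activ):
    # Call-site pattern from the comment above: 'linear' really means
    # "no activation", so skip the lookup entirely and just warn.
    if activ != "linear":
        return get_activation(activ)
    logger.warning("No activation function")
    return None
```

Small aside: `logger.warn` is a deprecated alias in the standard `logging` module, so `logger.warning` is the safer spelling.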
I was thinking that if someone chooses `linear`, it would also be nice to have it work.
In the list of activation functions there is also `linear`, which doesn't do anything except print a warning. So if it is used (which is something that makes sense), `activ` is initialized as `None`, giving errors.
Maybe we can create a fake activation function that doesn't do anything but still is a `torch.nn.Module`.
Also, the list of available activations must be updated.
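One option that avoids writing a fake activation from scratch: PyTorch already ships `torch.nn.Identity`, a no-op `Module` that returns its input unchanged. A sketch of how the registry could map `'linear'` to it (the registry name and contents here are illustrative, not the library's actual ones):

```python
import torch

# Illustrative registry: 'linear' maps to a real nn.Module that does nothing,
# so downstream code can call it like any other activation instead of
# special-casing None.
ACTIVATION_MAP = {
    "relu": torch.nn.ReLU,
    "tanh": torch.nn.Tanh,
    "linear": torch.nn.Identity,  # no-op, but still a torch.nn.Module
}

def get_activation(activ: str) -> torch.nn.Module:
    try:
        return ACTIVATION_MAP[activ]()
    except KeyError:
        # Keeping the error message in sync with the registry also addresses
        # updating the list of available activations.
        raise ValueError(
            f"Unknown activation '{activ}'. Available: {sorted(ACTIVATION_MAP)}"
        )
```

With this, `get_activation('linear')` returns a module that passes tensors through untouched, so no warning or `None` handling is needed at the call sites.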