Closed: Frightera closed this issue 1 year ago
TensorFlow and PyTorch have it implemented, but JAX does not. Is it still worth adding this activation function? It should be a one-liner in JAX too.
The paper has 1405 citations: Mish: A Self Regularized Non-Monotonic Activation Function
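For reference, a minimal sketch of that JAX one-liner, using the definition mish(x) = x * tanh(softplus(x)) from the cited paper (the function name `mish` here is just illustrative, not an existing JAX API):

```python
import jax.numpy as jnp
from jax.nn import softplus

def mish(x):
    # Mish(x) = x * tanh(softplus(x)), where softplus(x) = ln(1 + exp(x)).
    return x * jnp.tanh(softplus(x))

# Quick sanity check on a few values; mish(0) should be exactly 0.
print(mish(jnp.array([-2.0, 0.0, 2.0])))
```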
We already have it in keras.activations -- no clear need to have it in ops.nn IMO.
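For anyone landing here, a short usage sketch of the existing activation, assuming the current Keras API where `keras.activations.mish` is exposed and string identifiers resolve through `keras.activations`:

```python
import keras
import numpy as np

x = np.linspace(-3.0, 3.0, 7)

# Call the existing activation directly on a tensor/array:
y = keras.activations.mish(x)

# Or reference it by name when building a layer:
layer = keras.layers.Dense(32, activation="mish")
```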