**Open** · christianechevarria opened this issue 4 years ago
This suggestion was made by @digantamisra98 and ported over from Synaptic https://github.com/cazala/synaptic/issues/344
Mish is a novel activation function proposed in this paper. It has shown promising results so far and has been adopted by several packages, including:

- TensorFlow-Addons
- SpaCy (Tok2Vec layer)
- Thinc - SpaCy's official NLP-based ML library
- Echo AI
- Eclipse's deeplearning4j
- Hasktorch
- CNTKX - extension of Microsoft's CNTK
- FastAI-Dev
- Darknet
- Yolov3
- BeeDNN - library in C++
- Gen-EfficientNet-PyTorch
- dnet

All benchmarks, analysis, and links to official package implementations can be found in this repository.

It would be nice to have Mish as an option within the activation function group.

This is the comparison of Mish with other conventional activation functions in a SEResNet-50 on CIFAR-10 (better accuracy and faster than GELU):
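For reference, Mish is defined in the paper as `Mish(x) = x * tanh(softplus(x))`, where `softplus(x) = ln(1 + e^x)`. A minimal NumPy sketch (the function names `softplus` and `mish` are just illustrative, not from any of the listed packages):

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)) via logaddexp(0, x)
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish(x) = x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

x = np.array([-2.0, 0.0, 2.0])
print(mish(x))
```

Like GELU and Swish, Mish is smooth and non-monotonic, with a small negative dip for negative inputs before saturating toward zero.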
This seems very promising. We should definitely add this to our collection of activation functions.