digantamisra98 / Mish

Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
https://www.bmvc2020-conference.com/assets/papers/0928.pdf
MIT License

How to use Mish in tensorflow slim.conv2d? #21

Closed henbucuoshanghai closed 4 years ago

henbucuoshanghai commented 4 years ago

slim.conv2d defaults to ReLU. How can I change it to Mish?

digantamisra98 commented 4 years ago

@henbucuoshanghai Hi. I'm not sure how to integrate Mish into TF Slim; I'll have to check. Meanwhile, you can try integrating TF-Addons' Mish into the TF Slim architecture. Give me some time and I'll get back to you on this issue. Link to the TFA implementation - https://github.com/tensorflow/addons/tree/master/tensorflow_addons/activations#contents

henbucuoshanghai commented 4 years ago

Very kind of you.

digantamisra98 commented 4 years ago

@seanpmorgan @WindQAQ

seanpmorgan commented 4 years ago

Hi! TF-Addons targets TF 2.x, so there is no support for the slim library (slim was part of tf.contrib, which is no longer available in TF2).

I think the easiest way to implement it in your existing architecture (given that you have to use slim) would be to write a simple Python-ops version of Mish that you could use:

import tensorflow as tf

def mish(inputs):
    # Mish: x * tanh(softplus(x))
    return inputs * tf.math.tanh(tf.math.softplus(inputs))
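As a sanity check on the formula, here is a pure-Python sketch (illustrative only; no TensorFlow required) that mirrors mish(x) = x * tanh(softplus(x)), with a numerically stable softplus:

```python
import math

def softplus(x):
    # Stable softplus: log(1 + e^x) without overflow for large positive x.
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def mish(x):
    # Mish activation: x * tanh(softplus(x)).
    return x * math.tanh(softplus(x))
```

This behaves as expected at the reference points: mish(0) = 0, and mish(x) approaches x for large positive x. In a slim codebase, the TF version above can then be passed to the layer via its activation function argument.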

TF-Addons uses custom-op kernels for CPU/GPU that would give you a noticeable performance increase, but that is non-trivial to incorporate into your code base (likely not worth the effort, though it depends on your use case).

digantamisra98 commented 4 years ago

@henbucuoshanghai Hopefully this solved your issue. Closing this for now. Feel free to re-open.