Open cong-liu-2000 opened 1 year ago
SiLU is a popular activation function; it is used in the YOLOv5 family of networks. Could you support it?

Thanks for reporting this to us. We will consider supporting this activation function in a future version, but it will definitely take some time.

@shizhouxing I think you could also consider SiLU and other similar activation functions (Softplus, ELU, etc.) in benchmarks for non-linear functions. They are very useful non-linearities in many applications: sometimes they perform better than ReLU, and sometimes there are theoretical requirements for smooth activations, in which case ReLU cannot be used.
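For reference, SiLU (also called Swish) is defined as silu(x) = x * sigmoid(x). Below is a minimal sketch of dropping it into a small YOLO-style conv block using PyTorch's built-in `nn.SiLU`; the `ConvBlock` module and its parameters are only illustrative and are not taken from this repository.

```python
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """Illustrative Conv -> BatchNorm -> SiLU block (not repository code)."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        # SiLU(x) = x * sigmoid(x): smooth everywhere, unlike nn.ReLU().
        self.act = nn.SiLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))

if __name__ == "__main__":
    x = torch.randn(1, 3, 32, 32)
    print(ConvBlock(3, 16)(x).shape)  # torch.Size([1, 16, 32, 32])
```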