Verified-Intelligence / alpha-beta-CROWN

alpha-beta-CROWN: An Efficient, Scalable and GPU Accelerated Neural Network Verifier (winner of VNN-COMP 2021, 2022, 2023, and 2024)

Support SiLU activation function #20

Open cong-liu-2000 opened 1 year ago

cong-liu-2000 commented 1 year ago

SiLU is a popular activation function. It is used in YOLOv5 networks. Can you support it?
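For reference, SiLU (also known as Swish with β = 1) is defined as x · σ(x), where σ is the sigmoid function. A minimal PyTorch sketch of the function itself (not tied to any alpha-beta-CROWN API):

```python
import torch
import torch.nn.functional as F

def silu(x: torch.Tensor) -> torch.Tensor:
    # SiLU / Swish-1: x * sigmoid(x); smooth and non-monotonic near zero
    return x * torch.sigmoid(x)

x = torch.linspace(-4.0, 4.0, steps=9)
# Matches PyTorch's built-in implementation
assert torch.allclose(silu(x), F.silu(x))
print(silu(x))
```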

huanzhang12 commented 1 year ago

Thanks for reporting this to us. We will consider supporting this activation function in a future version, but it will definitely take some time.

@shizhouxing I think you can consider SiLU and other similar activation functions (Softplus, ELU, etc.) in benchmarks for non-linear functions. They are very useful non-linearities in many applications. Sometimes they perform better than ReLU, and sometimes there are theoretical requirements that the activation be smooth, so ReLU cannot be used.
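As an illustration of the kind of models such a benchmark might contain, here is a sketch in plain PyTorch (not any alpha-beta-CROWN configuration format or official benchmark): the same small MLP instantiated with ReLU and with the smooth activations mentioned above.

```python
import torch
import torch.nn as nn

def make_mlp(activation: nn.Module, in_dim: int = 2, hidden: int = 16) -> nn.Sequential:
    # Small MLP whose only non-linearity is the given activation module
    return nn.Sequential(
        nn.Linear(in_dim, hidden),
        activation,
        nn.Linear(hidden, hidden),
        activation,
        nn.Linear(hidden, 1),
    )

# Hypothetical benchmark variants: one architecture, several activations
models = {
    "relu": make_mlp(nn.ReLU()),
    "silu": make_mlp(nn.SiLU()),
    "softplus": make_mlp(nn.Softplus()),
    "elu": make_mlp(nn.ELU()),
}

x = torch.randn(4, 2)
for name, model in models.items():
    print(name, model(x).shape)
```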