LeapLabTHU / MLLA

Official repository of MLLA

the ELU activation function #19

Open Journey7331 opened 3 days ago

Journey7331 commented 3 days ago

Hi, @tian-qing001 Thanks for your great work!

Is there any ablation study on the activation function? And why is `+ 1.0` added to `q` after applying ELU? Is it to ensure `q > 0` for more stable training? 👀

https://github.com/LeapLabTHU/MLLA/blob/5a1719682cb040ec708fb633ad1379c9eab28576/models/mlla.py#L129-L130
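For context (my own reading, not from the repo): in linear attention the softmax is replaced by a kernel feature map φ, and φ(x) = elu(x) + 1 is a common choice because it maps every real input into (0, ∞), so the attention weights and the normalizer stay strictly positive. A minimal NumPy sketch with hypothetical names, assuming this is the role the `+ 1.0` plays here:

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (exp(x) - 1) for x <= 0
    return np.where(x > 0, x, alpha * np.expm1(x))

def feature_map(x):
    # elu(x) + 1 is strictly positive: for x > 0 it is x + 1 > 1,
    # for x <= 0 (with alpha = 1) it equals exp(x) > 0
    return elu(x) + 1.0

def linear_attention(q, k, v, eps=1e-6):
    # O(n * d^2) linear attention: phi(Q) (phi(K)^T V),
    # normalized by phi(Q) (phi(K)^T 1); positivity of phi
    # keeps the normalizer away from zero
    q, k = feature_map(q), feature_map(k)
    kv = k.T @ v                  # (d, d_v)
    z = q @ k.sum(axis=0) + eps   # (n,) normalizer
    return (q @ kv) / z[:, None]

rng = np.random.default_rng(0)
q = rng.standard_normal((8, 16))
k = rng.standard_normal((8, 16))
v = rng.standard_normal((8, 16))
out = linear_attention(q, k, v)
```

Without the `+ 1.0`, ELU outputs can be negative (down to -1), which would allow negative "attention weights" and a normalizer near zero.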