gallenszl / CFNet

CFNet: Cascade and Fused Cost Volume for Robust Stereo Matching (CVPR 2021)
MIT License

Hello author, thank you for your work. Why is the mish activation function used only in the second training round, while the first round still uses relu? #40

Open asd9697 opened 1 month ago

gallenszl commented 1 month ago

Hello, this strategy was chosen based on an ablation study. You can also fix the activation function to mish throughout, but that leads to more pre-training time and a slight performance degradation. Of course, I believe you can find a better way to train the model with a single activation function; we simply did not spend more time testing it, since it is not the focus of our method.
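For readers wondering how such a two-round activation swap can be wired up, here is a minimal PyTorch sketch. This is a hypothetical skeleton, not CFNet's actual code: the activation is injected as a module so round 2 can reload the round-1 weights under mish, which works because activations carry no parameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Mish activation: x * tanh(softplus(x)) (Misra, 2019).
# Newer PyTorch versions also provide nn.Mish directly.
class Mish(nn.Module):
    def forward(self, x):
        return x * torch.tanh(F.softplus(x))

def build_model(activation: nn.Module) -> nn.Module:
    # Hypothetical skeleton, not CFNet's architecture; the activation
    # is passed in so it can be swapped without changing the layers.
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, padding=1),
        activation,
        nn.Conv2d(32, 32, 3, padding=1),
        activation,
    )

# Round 1: pre-train with ReLU (e.g., on SceneFlow).
model = build_model(nn.ReLU(inplace=True))
# ... round-1 (pre-training) loop ...

# Round 2: swap ReLU for Mish and keep the learned weights.
# Activation modules are parameter-free, so the state dict loads unchanged.
state = model.state_dict()
model = build_model(Mish())
model.load_state_dict(state)
# ... round-2 (fine-tuning) loop ...
```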


asd9697 commented 1 month ago

Thank you for your answer. I tried both mish and relu: using mish in the second round does reduce the EPE on SceneFlow, but when I upload the resulting disparity maps to KITTI, the metrics are worse than those produced with relu. Have you observed this as well?

gallenszl commented 1 month ago

Our conclusion is that using mish is better.
