GuoLanqing / ShadowFormer

ShadowFormer (AAAI 2023), PyTorch implementation
MIT License

About the loss function and torch.clamp #20

Closed LoveU3tHousand2 closed 1 year ago

LoveU3tHousand2 commented 1 year ago

I noticed that the paper says L1 loss is used, but the code uses CharbonnierLoss. Which of the two is actually better to use?

Another thing: I noticed that the training code clamps the output to (0, 1) before computing the loss. Does this have a large effect? What would happen if the output were not clamped?
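For context, the two choices being compared can be sketched as follows. This is a minimal numpy illustration of the Charbonnier formula sqrt((x - y)^2 + eps^2) (a smooth approximation of L1) and of clamping the prediction before the loss; the eps value and the example arrays are assumptions for illustration, not taken from the repository's code.

```python
import numpy as np

def charbonnier_loss(pred, target, eps=1e-3):
    # Charbonnier loss: a smooth, everywhere-differentiable approximation
    # of L1. eps=1e-3 is a common choice (assumption; check the repo's
    # loss definition for the exact value it uses).
    return np.mean(np.sqrt((pred - target) ** 2 + eps ** 2))

# Hypothetical raw network outputs that stray outside [0, 1]:
pred = np.array([0.2, 1.3, -0.1])
target = np.array([0.25, 1.0, 0.0])

# Clamping the prediction to [0, 1] before the loss, as the training
# code reportedly does (torch.clamp in PyTorch, np.clip here):
loss_clamped = charbonnier_loss(np.clip(pred, 0.0, 1.0), target)
loss_raw = charbonnier_loss(pred, target)
```

In this toy example the clamped loss is smaller, since out-of-range excursions past the target's valid [0, 1] range no longer contribute gradient; whether that matters in practice depends on how often the network's outputs leave [0, 1].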

LoveU3tHousand2 commented 1 year ago

One more question: what about Mixup... does it help?

GuoLanqing commented 1 year ago

This will only slightly affect the results. Thanks for your interest.