GuoLanqing / ShadowFormer

ShadowFormer (AAAI 2023), PyTorch implementation
MIT License

Why are the weights so big? ISTD_model_latest.pth (131M) #14

Closed igodogi closed 1 year ago

igodogi commented 1 year ago

According to the table in the paper, ShadowFormer has only 2.4M parameters. Why is the released weight file so large?

GuoLanqing commented 1 year ago

ShadowFormer has two versions, and I have only released the large one so far. The released version is slightly larger than Ours-Large, but its performance is also better than the results reported in the paper.

I will release the remaining version as soon as possible. Thanks for your interest.

GuoLanqing commented 1 year ago

The released version has around 11M parameters.

igodogi commented 1 year ago

Still confused. If saved in FP32 format, the weights should be about 11M × 4 B ≈ 44 MB. Why 131 MB?

GuoLanqing commented 1 year ago

You can try counting the parameters yourself from `model_restoration.parameters()`.
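A minimal sketch of that check, using a hypothetical toy `nn.Sequential` in place of the real `model_restoration` (the counting one-liner is the same for any `nn.Module`):

```python
import torch.nn as nn

# Toy stand-in for model_restoration; swap in the real model.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3),
    nn.Conv2d(16, 3, kernel_size=3),
)

# Sum the element counts of all parameter tensors.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")
```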


igodogi commented 1 year ago

OK. I found that your *.pth also includes the 'optimizer' state, which makes the file roughly three times the size of the weights alone. The weights by themselves are only about 44 MB.
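That matches a back-of-envelope check. Assuming FP32 storage and an Adam-style optimizer (which keeps two extra FP32 buffers, `exp_avg` and `exp_avg_sq`, per parameter; an assumption about this repo's trainer), the checkpoint lands near the observed 131 MB:

```python
# Rough size estimate, assuming FP32 tensors and Adam optimizer state.
PARAMS = 11_000_000        # ~11M parameters in the released model
BYTES_PER_FP32 = 4

weights_mb = PARAMS * BYTES_PER_FP32 / 1e6   # weights alone
# Adam stores two extra FP32 buffers per parameter,
# so saving the optimizer state roughly triples the file size:
checkpoint_mb = weights_mb * 3

print(weights_mb)     # 44.0
print(checkpoint_mb)  # 132.0
```

Saving only `model.state_dict()` with `torch.save`, rather than a dict that also carries the `'optimizer'` entry, would keep the file near the 44 MB weights-only size.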