Open John1231983 opened 3 years ago
It supports fp16.
Yes, I saw that fp16 is in there now and forgot to close my PR. Thanks for heads up.
Are you using Apex instead of AMP in PyTorch? The method of @xsacha allows us to use AMP, but we should modify something in metric.py as mentioned. @xsacha, could you update the PR? It is useful for many people, including me.
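For reference, native AMP in PyTorch replaces Apex's `amp.initialize`/`amp.scale_loss` with `autocast` and `GradScaler`. Below is a minimal sketch of one training step using native AMP; the model, optimizer, and data here are placeholders, not code from this repository or the PR.

```python
import torch
import torch.nn as nn

use_cuda = torch.cuda.is_available()
device = torch.device("cuda" if use_cuda else "cpu")

model = nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
# GradScaler scales the loss so small fp16 gradients do not underflow.
scaler = torch.cuda.amp.GradScaler(enabled=use_cuda)

inputs = torch.randn(4, 10, device=device)
targets = torch.randint(0, 2, (4,), device=device)

optimizer.zero_grad()
# autocast runs eligible ops in fp16 on CUDA; disabled (a no-op) on CPU.
with torch.cuda.amp.autocast(enabled=use_cuda):
    loss = criterion(model(inputs), targets)
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
```

Note that metrics computed inside `autocast` may come back as fp16 tensors, which is why metric.py needs the mentioned adjustment (e.g. casting logits to fp32 before comparison).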
Hi all, I am using PyTorch 1.16 and it supports FP16. Could you accept the PR? https://github.com/cavalleria/cavaface.pytorch/pull/32/files @xsacha please update the head file. Thanks