Closed wh1t3tea closed 7 months ago
Have you tried with num_classes=0 instead?
I don't think it's related to num_classes. This model's embedding size is 8192; if you want the embeddings before the classifier, you can add the code below and then apply any pooling technique you prefer:
model.classifier = nn.Identity()
which will output a 960-dimensional vector, not 512
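For context, here is a minimal runnable sketch of the idea above, using a stand-in backbone whose head is named `classifier` (GhostFaceNetsV2's real architecture will differ; this only illustrates replacing the head with `nn.Identity()` to get pre-classifier embeddings):

```python
import torch
import torch.nn as nn

# Stand-in for a backbone whose final layer is named `classifier`.
# (Illustrative only; the real GhostFaceNetsV2 layers are different.)
class TinyBackbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Conv2d(3, 960, kernel_size=3, padding=1)
        self.classifier = nn.Linear(960, 10)

    def forward(self, x):
        x = self.features(x)       # (B, 960, H, W)
        x = x.mean(dim=(-2, -1))   # global average pool -> (B, 960)
        return self.classifier(x)

model = TinyBackbone()
model.classifier = nn.Identity()   # drop the head, keep the embeddings

emb = model(torch.randn(2, 3, 8, 8))
print(emb.shape)                   # torch.Size([2, 960])
```

With the head replaced, the forward pass returns the pooled 960-dimensional features directly instead of class logits.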
Thank you @khursani8. @wh1t3tea, please confirm whether his method fixes your problem.
Yeah, it works fine, but I have a question about .mean(-1): is it correct to squeeze dims like that? I'm a beginner and am seeing this for the first time.
Yes and no. The reason for squeezing like that is that the classifier here is not a pretrained weight, just a random one, so it was only an example. Usually people use pooling: https://www.kaggle.com/code/debarshichanda/pytorch-arcface-gem-pooling-starter
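The GeM (generalized mean) pooling referenced in the linked notebook can be sketched roughly like this (a common formulation; the parameter names and defaults are illustrative, not the notebook's exact code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeM(nn.Module):
    """Generalized mean pooling: average-pool x**p, then take the 1/p root.
    p=1 recovers average pooling; a large p approaches max pooling."""
    def __init__(self, p=3.0, eps=1e-6):
        super().__init__()
        self.p = nn.Parameter(torch.ones(1) * p)  # learnable exponent
        self.eps = eps

    def forward(self, x):                         # x: (B, C, H, W)
        x = x.clamp(min=self.eps).pow(self.p)
        x = F.avg_pool2d(x, kernel_size=x.shape[-2:])
        return x.pow(1.0 / self.p).flatten(1)     # (B, C)

pool = GeM()
feats = pool(torch.rand(2, 960, 7, 7))
print(feats.shape)                                # torch.Size([2, 960])
```

Unlike a plain `.mean(-1)`, this pools over the full spatial grid and has a learnable exponent, which is why it is the usual choice for retrieval-style embeddings.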
Or you can read more in retrieval papers.
Thanks a lot! Now it looks reasonable to me.
@wh1t3tea I have reread the paper and gone through the paper's code again. I've fixed the implementation and will release it in the next 12 hours.
@Hazqeel09, that would be great! I haven't found any other PyTorch implementations of this model, so you are doing a great job. Thanks!
Now it works well.
Ohh sorry @wh1t3tea, I did not read the paper and assumed the implementation was correct, since I usually use the embeddings before the classifier.
It's okay @khursani8, the mistake was on my side. At that time, I was still figuring out these implementations sksksk
Thank you for trying to help and for joining the discussion.
@wh1t3tea I've released 1.4.7 to fix the implementation; kindly try the latest version.
I have problems using GhostFaceNetsV2 as a feature extractor. As far as I can see, it doesn't provide a 512-dimensional vector when using num_classes=None. Can you help me with that?