HobbitLong / SupContrast

PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally)
BSD 2-Clause "Simplified" License

SupContrast with Moco trick #30

Open richcmwang opened 4 years ago

richcmwang commented 4 years ago

Hi, thank you for sharing this nice work. I noticed the reported result of SupContrast with the MoCo trick on ImageNet (79.1%). Do you have plans to push that code here? It would be helpful to see it. Thanks!
[Screenshot: reported ImageNet results table, 2020-08-09]

jlindsey15 commented 4 years ago

Hi! Just wanted to second this: it would be great to have the code and a pre-trained model for the SupContrast network on ImageNet.

yassouali commented 3 years ago

Hi, I agree with the above.

Regarding the MoCo trick, I am curious how it is implemented. In the original MoCo paper, the elements of the momentum queue are only used as negatives. In this case, are queue elements used both as positives (when they share the anchor's label) and as negatives? And is there any difference in how the current batch embeddings and the momentum-queue embeddings are used?
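
To make the question concrete, here is a minimal sketch (my assumption, not the authors' code) of what a queue-based SupCon loss could look like if queue entries with a matching label count as positives and everything else as negatives:

```python
import torch
import torch.nn.functional as F

def supcon_moco_loss(query, query_labels, queue, queue_labels, temperature=0.07):
    """Sketch of a SupCon loss whose contrast set is a momentum queue.

    query:        (B, D) L2-normalized embeddings from the online encoder
    query_labels: (B,)   class labels of the current batch
    queue:        (K, D) L2-normalized embeddings from the momentum encoder
    queue_labels: (K,)   labels stored when each embedding was enqueued
    """
    # Cosine similarities between the batch and every queue entry.
    logits = query @ queue.T / temperature                      # (B, K)

    # Assumed positive definition: queue entries sharing the anchor's label.
    pos_mask = (query_labels[:, None] == queue_labels[None, :]).float()

    # Numerical stability: subtract the per-row max before exponentiating.
    logits = logits - logits.max(dim=1, keepdim=True).values.detach()
    log_prob = logits - torch.log(torch.exp(logits).sum(dim=1, keepdim=True))

    # Average the log-likelihood over positives, then over anchors that
    # actually have at least one positive in the queue.
    pos_count = pos_mask.sum(dim=1)
    valid = pos_count > 0
    mean_log_prob_pos = (pos_mask * log_prob).sum(dim=1)[valid] / pos_count[valid]
    return -mean_log_prob_pos.mean()

# Tiny usage example with random data:
q = F.normalize(torch.randn(8, 128), dim=1)
queue = F.normalize(torch.randn(1024, 128), dim=1)
loss = supcon_moco_loss(q, torch.randint(0, 10, (8,)),
                        queue, torch.randint(0, 10, (1024,)))
```

If this is roughly right, gradients would flow only through the current batch, since the queue holds detached momentum-encoder features; that would also explain the memory savings.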

Thanks.

seefun commented 3 years ago

Are there any plans to release the code for SupCon with the MoCo trick?

usama13o commented 2 years ago

I think it would be great if we could get an implementation using the memory trick, to help those of us with GPU memory constraints.
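
For reference, a rough sketch (hypothetical helper names, following MoCo's design rather than anything in this repo) of the queue maintenance such a memory trick would need. Only the current batch stays in the autograd graph, which is what saves GPU memory:

```python
import torch

@torch.no_grad()
def momentum_update(online_encoder, momentum_encoder, m=0.999):
    # EMA update of the momentum encoder, as in MoCo.
    for p_q, p_k in zip(online_encoder.parameters(),
                        momentum_encoder.parameters()):
        p_k.data.mul_(m).add_(p_q.data, alpha=1 - m)

@torch.no_grad()
def dequeue_and_enqueue(queue, queue_labels, keys, labels, ptr):
    # Replace the oldest queue entries with the newest momentum features
    # (and, for SupCon, their labels), advancing a circular pointer.
    k = keys.shape[0]
    idx = torch.arange(ptr, ptr + k) % queue.shape[0]
    queue[idx] = keys
    queue_labels[idx] = labels
    return (ptr + k) % queue.shape[0]
```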

songyonger commented 1 year ago

@yassouali I have the same question as you. Have you figured out how to implement it?