HobbitLong / SupContrast

PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally)
BSD 2-Clause "Simplified" License

Supervised Contrastive Learning with n_views=1 #122

Open piconti opened 1 year ago

piconti commented 1 year ago

Hello, first, thank you for this paper and code! I want to adapt your approach to use Supervised Contrastive Learning after a region proposal network.

In this context, I was wondering, since it isn't explicitly stated in the paper: what are your exact motivations for including two views of each image in every batch? My understanding/intuition is that it ensures the anchor is exposed to a "decent"/minimum number of positive samples. Is that right?

Thank you!

Breeze-zyuhan commented 1 year ago

My understanding is that it guarantees that positives exist in the multiviewed batch: if an anchor's class appears only once (e.g. with n_views=1 and all-distinct labels), its positive set P(i) is empty and the numerator in the loss formula (Eq. 2/3) would be 0.
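To make the failure mode concrete, here is a minimal pure-Python sketch (hypothetical helper, not the repo's SupConLoss) that counts positives per anchor in the multiviewed batch. With n_views=1 and all-distinct labels, every anchor has zero positives, so the sum over P(i) in Eq. 2/3 is empty; with n_views=2, each anchor is guaranteed at least one positive (its other view):

```python
def count_positives(labels, n_views):
    """Count same-label samples (excluding the anchor itself) for each anchor
    in the multiviewed batch, where every sample appears n_views times."""
    batch = labels * n_views  # multiviewed batch labels, e.g. [0,1]*2 -> [0,1,0,1]
    return [
        sum(1 for j, lj in enumerate(batch) if j != i and lj == li)
        for i, li in enumerate(batch)
    ]

# All labels distinct, single view: every positive set P(i) is empty.
print(count_positives([0, 1, 2, 3], n_views=1))  # [0, 0, 0, 0]

# Two views: each anchor has at least its other view as a positive.
print(count_positives([0, 1, 2, 3], n_views=2))  # [1, 1, 1, 1, 1, 1, 1, 1]
```

So after a region proposal network, n_views=1 can work, but only if the batch is built so that each sampled class appears at least twice.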