sthalles / SimCLR

PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations
https://sthalles.github.io/simple-self-supervised-learning/
MIT License

Size of tensors in cosine_similarity function #19

Closed Wonder1905 closed 3 years ago

Wonder1905 commented 3 years ago

Hi, I'm trying to understand the code in loss/nt_xent.py.

We pass "representations" as both arguments:

    def forward(self, zis, zjs):
        representations = torch.cat([zjs, zis], dim=0)
        similarity_matrix = self.similarity_function(representations, representations)

But in the cosine similarity function the comments say x has shape (N, 1, C) while y has shape (1, 2N, C). How can y be twice as large if both arguments are the same tensor?

    def _cosine_simililarity(self, x, y):
        # x shape: (N, 1, C)
        # y shape: (1, 2N, C)
        # v shape: (N, 2N)
        v = self._cosine_similarity(x.unsqueeze(1), y.unsqueeze(0))
        return v

Thanks for your help.

vanIvan commented 3 years ago

@BattashB I don't understand it either

guanyadong commented 3 years ago

I don't understand it either

YinAoXiong commented 3 years ago

I have the same question

YinAoXiong commented 3 years ago

@BattashB I think this is just a small mistake in the comments; the correct shapes are the following (the same applies to the _dot_simililarity function):

    def _cosine_simililarity(self, x, y):
        # x, y shape on entry: (2N, C)
        # after unsqueeze: x -> (2N, 1, C), y -> (1, 2N, C)
        # v shape: (2N, 2N)
        v = self._cosine_similarity(x.unsqueeze(1), y.unsqueeze(0))
        return v
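A quick way to confirm the corrected shapes is to run the broadcasting by hand. Below is a minimal sketch of the same pattern (the values N=3 and C=5 are arbitrary, chosen only for illustration):

```python
import torch

# Hypothetical batch size and feature dimension for illustration.
N, C = 3, 5
zis = torch.randn(N, C)  # projections of augmented view 1
zjs = torch.randn(N, C)  # projections of augmented view 2

# As in forward(): both views are concatenated, giving 2N rows.
representations = torch.cat([zjs, zis], dim=0)  # shape (2N, C)

cos = torch.nn.CosineSimilarity(dim=-1)

# unsqueeze(1) -> (2N, 1, C); unsqueeze(0) -> (1, 2N, C).
# Broadcasting over the first two dims yields a (2N, 2N) matrix of
# pairwise cosine similarities between all rows.
v = cos(representations.unsqueeze(1), representations.unsqueeze(0))

print(v.shape)  # torch.Size([6, 6]) == (2N, 2N)
```

So even though the same tensor is passed twice, the two unsqueeze calls place its 2N rows on different broadcast axes, which is where the apparent "doubling" in the comments comes from: both axes are already 2N, not N.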

sthalles commented 3 years ago

Hello guys, this is a general message to say that I have refactored the whole project. I believe it is much easier to understand now. Please have a look at the new implementation and feel free to submit a PR if you find any bugs. Thanks.