harveyslash / Facial-Similarity-with-Siamese-Networks-in-Pytorch

Implementing Siamese networks with a contrastive loss for similarity learning
https://hackernoon.com/one-shot-learning-with-siamese-networks-in-pytorch-8ddaab10340e
MIT License
972 stars 274 forks

Loss Function #28

Closed ghost closed 4 years ago

ghost commented 5 years ago

Instead of

def forward(self, output1, output2, label):
    euclidean_distance = F.pairwise_distance(output1, output2)
    loss_contrastive = torch.mean((1 - label) * torch.pow(euclidean_distance, 2) +
                                  (label) * torch.pow(torch.clamp(self.margin - euclidean_distance, min=0.0), 2))
    return loss_contrastive
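For context, this `forward` lives inside the repo's `ContrastiveLoss` module, which supplies `self.margin`. A minimal self-contained sketch (the `margin=2.0` default here is an assumption about the repo's setting; the formulation follows Hadsell et al.'s contrastive loss, with label 0 meaning "similar pair" and label 1 "dissimilar pair"):

```python
import torch
import torch.nn.functional as F


class ContrastiveLoss(torch.nn.Module):
    """Pulls similar pairs (label=0) together and pushes dissimilar
    pairs (label=1) at least `margin` apart in embedding space."""

    def __init__(self, margin=2.0):  # margin=2.0 is assumed, not confirmed from the repo
        super().__init__()
        self.margin = margin

    def forward(self, output1, output2, label):
        euclidean_distance = F.pairwise_distance(output1, output2)
        # label=0 term penalizes distance; label=1 term penalizes being
        # closer than the margin; mean reduces over the batch.
        loss_contrastive = torch.mean(
            (1 - label) * torch.pow(euclidean_distance, 2)
            + label * torch.pow(torch.clamp(self.margin - euclidean_distance, min=0.0), 2)
        )
        return loss_contrastive
```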

Shouldn't the loss function be this instead?

def forward(self, output1, output2, label):
    euclidean_distance = F.pairwise_distance(output1, output2)
    loss_contrastive = 0.5 * ((1 - label) * torch.pow(euclidean_distance, 2) +
                              (label) * torch.pow(torch.clamp(self.margin - euclidean_distance, min=0.0), 2))
    return loss_contrastive.mean()
harveyslash commented 4 years ago

From my understanding, the only changes you have made are adding the 0.5 factor and taking the mean at the end. I don't think that changes anything: multiplying the loss by a constant does not affect training (especially since I used the Adam optimiser, whose updates are invariant to a global rescaling of the loss).

I would be really interested in seeing the results. Could you give some numbers?