
Addition of Siamese loss or contrastive loss function #34879

Open Hemantr05 opened 4 years ago

Hemantr05 commented 4 years ago

🚀 Feature

I would like to contribute to PyTorch (in Python and C++) by adding the contrastive loss / siamese margin loss function, which is commonly used for Siamese networks.

Motivation

I have been working on Siamese networks and have implemented contrastive loss / siamese margin loss to measure the similarity between bottleneck embeddings of an autoencoder.

Pitch

The output of the loss function is similar to that of TripletMarginLoss, but for a Siamese network instead of a triplet network.

It returns a float value based on the distance between the bottleneck embeddings.
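For reference, the usual pairwise contrastive formulation (Hadsell et al., 2006; a sketch up to constant factors, with label y = 0 for similar pairs and y = 1 for dissimilar pairs) is:

L(d, y) = (1 - y) * d^2 + y * max(0, margin - d)^2

where d is the Euclidean distance between the two embeddings and margin is the minimum desired separation for dissimilar pairs.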

Additional context

This loss can be used in all Siamese network applications (such as facial similarity and the Omniglot examples).

Hemantr05 commented 4 years ago

May I add the loss function?

mrshenli commented 4 years ago

Adding this to the triage review meeting to discuss how we should proceed with this feature request and contribution.

ezyang commented 4 years ago

This one should go to the "needs research" queue

gchanan commented 4 years ago

1430 citations since 2006.

Hemantr05 commented 4 years ago

> 1430 citations since 2006.

Yes.

Hemantr05 commented 4 years ago

This loss is very widely used for Siamese networks.

Hemantr05 commented 4 years ago

@pbelevich @ezyang @gchanan @mrshenli What's the conclusion?

soumith commented 4 years ago

It seems like a reasonable addition.

The loss ends up being something like this reference:

# label = 0 for similar pairs, 1 for dissimilar; self.margin is the separation margin
euclidean_distance = F.pairwise_distance(output1, output2, keepdim=True)
loss_contrastive = torch.mean((1 - label) * torch.pow(euclidean_distance, 2) +
                              label * torch.pow(torch.clamp(self.margin - euclidean_distance, min=0.0), 2))

Let's use the APIs F.siamese_loss and nn.SiameseLoss unless you have better ideas. I think ContrastiveLoss is too broad a term, as there are many kinds of contrastive losses.

As with all contributions, we would need unit tests, docstrings, etc.

Thanks for your effort.

Hemantr05 commented 4 years ago

> It seems like a reasonable addition.
>
> The loss ends up being something like this reference:
>
>     euclidean_distance = F.pairwise_distance(output1, output2, keepdim=True)
>     loss_contrastive = torch.mean((1 - label) * torch.pow(euclidean_distance, 2) +
>                                   label * torch.pow(torch.clamp(self.margin - euclidean_distance, min=0.0), 2))
>
> Let's use the APIs F.siamese_loss and nn.SiameseLoss unless you have better ideas. I think ContrastiveLoss is too broad a term, as there are many kinds of contrastive losses.
>
> As with all contributions, we would need unit tests, docstrings, etc.
>
> Thanks for your effort.

F.siamese_loss and nn.SiameseLoss sound perfect. Thanks for considering it.

I will provide unit tests, docstrings, etc.

Thank you for your support
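A minimal sketch of what such a module could look like, based on the reference snippet above (the SiameseLoss name follows the proposal in this thread; the exact signature and the margin default are assumptions, since this API does not exist in PyTorch yet):

import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseLoss(nn.Module):
    # Margin-based pairwise loss for Siamese networks (hypothetical API).
    # label = 0: similar pair    -> penalize squared distance
    # label = 1: dissimilar pair -> penalize being closer than `margin`
    def __init__(self, margin=1.0):
        super().__init__()
        self.margin = margin

    def forward(self, output1, output2, label):
        d = F.pairwise_distance(output1, output2, keepdim=True)  # shape (batch, 1)
        loss = (1 - label) * d.pow(2) + label * torch.clamp(self.margin - d, min=0.0).pow(2)
        return loss.mean()

# Usage sketch with random embeddings from the two branches
criterion = SiameseLoss(margin=1.0)
out1, out2 = torch.randn(8, 128), torch.randn(8, 128)
label = torch.randint(0, 2, (8, 1)).float()  # 0 = similar, 1 = dissimilar
loss = criterion(out1, out2, label)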

henrywallace commented 4 years ago

See also the ContrastiveLoss implementation from https://github.com/KevinMusgrave/pytorch-metric-learning/blob/aa0e4dd375d9c233eddd184835ff2a51b2e59102, among others.

Hemantr05 commented 4 years ago

> See also the ContrastiveLoss implementation from https://github.com/KevinMusgrave/pytorch-metric-learning/blob/aa0e4dd375d9c233eddd184835ff2a51b2e59102, among others.

Will do @henrywallace