HobbitLong / SupContrast

PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally)
BSD 2-Clause "Simplified" License

Can SupCon loss be used for multi-label classification? #90

Open littttttlebird opened 3 years ago

littttttlebird commented 3 years ago

I have a text multi-label classification task; can I use SupCon loss? The SupCon loss would be accumulated over every label view. For example, with batch labels = [[1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 1, 1]]:

- from the view of label 0: positives = {0, 2}, negatives = {1, 3}
- from the view of label 1: positives = {1, 2, 3}, negatives = {0}
- from the view of label 2: positives = {0, 1, 3}, negatives = {2}

Is this setting reasonable?
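
For concreteness, here is a minimal sketch (plain PyTorch, illustrative only) that derives those per-label positive/negative sets from the multi-hot label matrix above:

```python
import torch

# Multi-hot labels from the example: rows = samples, columns = labels.
labels = torch.tensor([[1, 0, 1],
                       [0, 1, 1],
                       [1, 1, 0],
                       [0, 1, 1]])

# From the view of label k, samples with a 1 in column k are positives
# and samples with a 0 are negatives.
for k in range(labels.shape[1]):
    positives = torch.nonzero(labels[:, k] == 1).flatten().tolist()
    negatives = torch.nonzero(labels[:, k] == 0).flatten().tolist()
    print(f"label {k}: positives={positives}, negatives={negatives}")
# label 0: positives=[0, 2], negatives=[1, 3]
# label 1: positives=[1, 2, 3], negatives=[0]
# label 2: positives=[0, 1, 3], negatives=[2]
```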

HobbitLong commented 2 years ago

This is an interesting question! I think in this case SupCon might not be as good as simply using binary cross-entropy for each label.
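
For reference, a minimal sketch of that BCE-per-label baseline; shapes and names here are illustrative, not from this repo:

```python
import torch
import torch.nn as nn

# One logit per label on top of a shared encoder output (random features
# stand in for the encoder here); BCEWithLogitsLoss applies a sigmoid and
# binary cross-entropy independently per label.
batch_size, feat_dim, num_labels = 4, 128, 3
features = torch.randn(batch_size, feat_dim)   # placeholder encoder output
classifier = nn.Linear(feat_dim, num_labels)
targets = torch.tensor([[1., 0., 1.],
                        [0., 1., 1.],
                        [1., 1., 0.],
                        [0., 1., 1.]])

loss = nn.BCEWithLogitsLoss()(classifier(features), targets)
```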

Ywandung-Lyou commented 2 years ago

How about using a separate classifier for each label after a shared encoder (ResNet in this case), with the total loss being the sum of the per-label SupCon losses? It is just an idea, and I have no idea whether it works. A sketch follows below.
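
One hedged sketch of that idea, using this repo's SupConLoss and treating each label column as a binary class assignment. Note this makes samples that both lack label k positives for each other under view k, which may or may not be what you want:

```python
import torch
import torch.nn.functional as F
from losses import SupConLoss  # this repo's loss

criterion = SupConLoss(temperature=0.07)

# SupConLoss expects L2-normalized features of shape [bsz, n_views, dim].
bsz, n_views, dim = 4, 2, 128
features = F.normalize(torch.randn(bsz, n_views, dim), dim=2)

labels = torch.tensor([[1, 0, 1],
                       [0, 1, 1],
                       [1, 1, 0],
                       [0, 1, 1]])

# One SupCon term per label column; the total loss is the sum over labels.
loss = sum(criterion(features, labels=labels[:, k])
           for k in range(labels.shape[1]))
```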

dbushpw commented 1 year ago

Would it be sufficient to just modify the mask and my input labels so that the mask correctly identifies data points that share at least one class, or is it necessary to modify more than just the mask in this loss function?
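
In this repo's SupConLoss, the labels are only ever used to build the positive-pair mask, so passing a custom [bsz, bsz] mask via the mask= argument should be sufficient; nothing else in the loss touches the labels. A minimal sketch, where "share at least one label" is one possible choice of criterion:

```python
import torch
import torch.nn.functional as F
from losses import SupConLoss  # this repo's loss

criterion = SupConLoss(temperature=0.07)

bsz, n_views, dim = 4, 2, 128
features = F.normalize(torch.randn(bsz, n_views, dim), dim=2)

labels = torch.tensor([[1., 0., 1.],
                       [0., 1., 1.],
                       [1., 1., 0.],
                       [0., 1., 1.]])

# mask[i, j] = 1 iff samples i and j share at least one label.
# SupConLoss tiles this [bsz, bsz] mask across views and removes
# self-contrast internally; with n_views >= 2 (and every sample having
# at least one label) each anchor keeps at least one positive, namely
# its own other view, which avoids a division by zero in the loss.
mask = (labels @ labels.T > 0).float()

loss = criterion(features, mask=mask)
```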