HobbitLong / SupContrast

PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally)
BSD 2-Clause "Simplified" License

supervised contrastive loss function #20

Open YuBeomGon opened 3 years ago

YuBeomGon commented 3 years ago

The input tensor to SupConLoss needs to have 3 dimensions: [batch size, 2 (two features made from augmentation), z_dim (128)].

If I don't want augmentation, is [batch size, 1, z_dim (128)] OK for SupConLoss?

The loss value is mostly NaN with a batch size of 32. Thanks.
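
For context, a minimal sketch of the two shapes being discussed, following the usage shown in this repo's README (`f1`, `f2`, and the labels are placeholders; `SupConLoss` is the class from `losses.py`):

```python
import torch
import torch.nn.functional as F
from losses import SupConLoss  # loss class shipped in this repo (losses.py)

criterion = SupConLoss(temperature=0.07)

bsz, z_dim = 32, 128
labels = torch.randint(0, 10, (bsz,))
# f1, f2 stand in for the L2-normalized projections of two augmented views
f1 = F.normalize(torch.randn(bsz, z_dim), dim=1)
f2 = F.normalize(torch.randn(bsz, z_dim), dim=1)

# two-view case: stack into [bsz, 2, z_dim] as in the README
features = torch.cat([f1.unsqueeze(1), f2.unsqueeze(1)], dim=1)
loss = criterion(features, labels)

# single-view case asked about here: [bsz, 1, z_dim] also satisfies the shape check,
# but positives then come only from *other* samples with the same label
loss_single = criterion(f1.unsqueeze(1), labels)
```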

HobbitLong commented 3 years ago

If not, you need to modify it accordingly. But the logic should not be hard.

JuliaWolleb commented 3 years ago

Hi, I have the same problem. Did you mean that the loss function SupConLoss should be modified? If yes, how? Or did you mean only the input dimensions, to [batch size, 1, z_dim (128)]?

YuBeomGon commented 3 years ago

Hi, I am sorry, I am away right now. Yes, in my case I changed the input size to [batch, 1, dim]. In that case the denominator can be zero if there is no other sample with the same label in the batch, so you should add some small value, like 0.0000001. Thanks.
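
To make the failure mode concrete, here is a small self-contained illustration of the divide-by-zero described above (the `mask`/`log_prob` names mirror the ones used in this repo's SupConLoss, but this is only a sketch of the suggested fix, not a drop-in patch):

```python
import torch

# With one view per sample, a sample whose label occurs only once in the batch
# has no positives, so the row sum of the positive mask is zero.
labels = torch.tensor([0, 0, 1])                       # label 1 appears once
mask = torch.eq(labels.unsqueeze(0), labels.unsqueeze(1)).float()
mask.fill_diagonal_(0)                                 # a sample is not its own positive
log_prob = torch.randn(3, 3)                           # stand-in for the log-probabilities

eps = 1e-7                                             # small constant, as suggested above
nan_version  = (mask * log_prob).sum(1) / mask.sum(1)          # last entry is NaN
safe_version = (mask * log_prob).sum(1) / (mask.sum(1) + eps)  # finite everywhere
```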
