-
Hi @tomaarsen. I just created a new issue.
I implemented the custom `Improve contrastive loss` from this paper: https://arxiv.org/abs/2308.03281
So my question is: is this loss already implemented…
-
Dear,
I am trying to understand your custom contrastive loss class. As I understand it, it correctly computes the positives by shifting the diagonal by batch_size and -batch_size to compute the …
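For context, here is a minimal sketch of that shifted-diagonal pattern as it usually appears in in-batch contrastive losses (NT-Xent style). This is not the repo's actual class; the function name and the temperature value are illustrative, and it assumes the common setup where the positive of row `i` in the concatenated `2*batch_size` similarity matrix sits at row `i + batch_size`:

```python
import torch
import torch.nn.functional as F

def info_nce_shifted(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """In-batch contrastive loss where positives lie on the diagonal
    shifted by +batch_size / -batch_size in the (2B, 2B) similarity matrix."""
    batch_size = z1.size(0)
    # Concatenate both views and L2-normalize: (2B, D)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)
    # Cosine similarity matrix scaled by temperature: (2B, 2B)
    sim = z @ z.t() / temperature
    # Mask self-similarity so each row never treats itself as a candidate.
    sim.fill_diagonal_(float("-inf"))
    # Positive of row i is row (i + B) mod 2B: the shifted diagonal.
    targets = (torch.arange(2 * batch_size, device=z.device) + batch_size) % (2 * batch_size)
    return F.cross_entropy(sim, targets)
```

With this layout, shifting by `+batch_size` picks the positive for the first view's rows and the modulo wraps the second view's rows back by `-batch_size`, which is the two-offset pattern described above.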
-
Hello! I am currently attempting to replicate the work presented in your paper and I am very grateful for the code you have provided. However, I have encountered an issue regarding the contrastive los…
-
Hi, I am currently using your source code on the node classification task as a baseline for testing. However, I noticed that when I ran the code on the provided datasets without any modification, the …
-
Hello, thanks for your excellent work.
I am trying to use contrastive learning in relevant object detection tasks (e.g. semi-supervised object detection), and I wrote my contrastive loss code by referring…
-
Hello, thank you for your great work. Could you please tell me where the Multi-label contrastive loss in your paper is implemented? I really can't find it, thank you very much.
-
According to the paper, the contrastive loss weight is set to 1e-5 * num_train_steps.
However, the given config has `loss_weights: [10, 1]`, where 1 is the contrastive loss weight.
May I know…
-
The code is not clear about the CCL mentioned in the paper. It is not mentioned in the losses.py file. Could you explain where in the code this particular loss is used?
-
Hi! I found something a bit confusing that you might be able to clarify. For the contrastive loss implemented in "LecbertForPreTraining" class, you use a sigmoid as the activation function before appl…