-
[MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss)
says:
The loss will use for the pair (a_i, p_i) all p_j for j …
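The quoted behavior is the in-batch-negatives idea: for each pair (a_i, p_i), every other positive p_j in the batch serves as a negative for a_i. A minimal pure-Python sketch of that loss (cosine similarity plus cross-entropy, with the hypothetical `scale` parameter standing in for the library's similarity scaling) might look like:

```python
import math

def mnr_loss(anchors, positives, scale=20.0):
    """Sketch of a multiple-negatives ranking loss: for each (a_i, p_i),
    all p_j with j != i act as in-batch negatives. The loss is the
    cross-entropy over scaled cosine similarities, with index i as the
    target class. This is an illustration, not the library's code."""
    def cos(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        nu = math.sqrt(sum(x * x for x in u))
        nv = math.sqrt(sum(x * x for x in v))
        return dot / (nu * nv)

    n = len(anchors)
    total = 0.0
    for i in range(n):
        # Similarity of anchor i to every positive in the batch.
        scores = [scale * cos(anchors[i], positives[j]) for j in range(n)]
        # Numerically stable log-softmax cross-entropy with target i.
        m = max(scores)
        log_z = m + math.log(sum(math.exp(s - m) for s in scores))
        total += log_z - scores[i]
    return total / n
```

When each anchor matches its own positive and not the others, the loss is near zero; swapping the positives drives it up sharply.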
-
Dear Dr. Zhong,
I have been trying to test HamGNN on a system of interest, but I have encountered issues in reducing the training error. See the following correlation plot. I would really appreciate it if you h…
-
Hi, kaiyang:
I want to reproduce the AGW method, and I found that Weighted Regularization Triplet returns loss and correct, so should I write a new image engine following the torchreid [guide?](https://kaiya…
-
I modified your code for my problem. I added regularization terms to the Embedding layers. But when I train the model, both the test loss and the val loss go to 0.5. I guess this is because both user and item lat…
-
This question is very broad and theoretical, so apologies for that!
I would love to learn more about the various loss functions that could be implemented in `tfrs` - when looking at other libs such as `li…
-
**Environment:**
1. Framework: PyTorch
2. Framework version: 1.9.0
3. Horovod version: 0.23.0
4. MPI version: 3.1.2
5. CUDA version: 10.1
6. NCCL version:
7. Python version: 3.6.2
8. Spark / …
-
- [x] try out clustering with triplet loss
- [ ] try out magnet loss if clustering works well and there is enough time
Images:
* 224x224 with resnet18
* Output 128
* same learning rate as for th…
-
https://arxiv.org/abs/1810.04652
-
My understanding is that pos should be sorted in ascending order and neg in descending order, so that the resulting triplets have a relatively large loss.
Alternatively, consider all combinations of anchor, pos, and neg, and select the ones with larger loss for training.
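The second strategy above (enumerate all combinations and keep the hardest) can be sketched for a single anchor as follows; `pos_dists`/`neg_dists` are hypothetical precomputed anchor-positive and anchor-negative distances, and `margin` is the usual triplet margin:

```python
def hardest_triplets(pos_dists, neg_dists, margin=0.3, k=2):
    """Sketch of hard-triplet selection for one anchor: compute the
    triplet loss max(d_ap - d_an + margin, 0) for every (pos, neg)
    combination and return the k combinations with the largest loss
    as (loss, pos_index, neg_index) tuples."""
    losses = []
    for i, d_ap in enumerate(pos_dists):
        for j, d_an in enumerate(neg_dists):
            loss = max(d_ap - d_an + margin, 0.0)
            losses.append((loss, i, j))
    # Largest losses first: these are the hardest triplets to train on.
    losses.sort(key=lambda t: t[0], reverse=True)
    return losses[:k]
```

Sorting positives by distance (descending) and negatives by distance (ascending) is a cheaper approximation of the same idea, since the far positives and near negatives are exactly the ones producing large loss.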
-
I am training a custom dataset using cosine-softmax, but triplet loss, magnet loss, and cross-entropy loss are all shown on TensorBoard.
Which one should I be looking at when using cosine-so…