-
Hi,
I tried to train on the wikipedia dataset with triplet loss using the code given here: https://github.com/UKPLab/sentence-transformers/blob/master/examples/training/other/training_wikipedia_se…
-
I have a sequence classification dataset that I want to use to train sentence embeddings with triplet loss. How should I restructure the dataset to make it compatible with the codebase?
Also how s…
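One common way to restructure a labeled classification dataset is to convert the (text, label) pairs into the (anchor, positive, negative) triples that triplet loss expects: same-label texts become positives, different-label texts become negatives. A minimal sketch in plain Python; `build_triplets` is a hypothetical helper name, not part of any library:

```python
import random
from collections import defaultdict

def build_triplets(examples, num_per_anchor=1, seed=0):
    """Turn (text, label) pairs into (anchor, positive, negative) triples.

    For each text, sample a positive from the same label and a negative
    from a different label. Illustrative sketch only.
    """
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for text, label in examples:
        by_label[label].append(text)
    labels = list(by_label)

    triplets = []
    for text, label in examples:
        positives = [t for t in by_label[label] if t != text]
        if not positives or len(labels) < 2:
            continue  # need at least one positive and one other class
        for _ in range(num_per_anchor):
            pos = rng.choice(positives)
            neg_label = rng.choice([l for l in labels if l != label])
            neg = rng.choice(by_label[neg_label])
            triplets.append((text, pos, neg))
    return triplets
```

Each resulting triple can then be fed to a triplet objective (e.g. wrapped in whatever example/batch format the training code expects).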
-
Training using google/mt5-base as the base model with fp16 and the triplet loss on all-nli data (following the [example with trainer](https://www.sbert.net/docs/sentence_transformer/training_overview…
-
```
File "train.py", line 230, in train_model
    loss5 = triplet(p, labels)[0]
File "/home/zzy/anaconda3/envs/baseline/lib/python3.6/site-packages/torch/nn/modules/module.py", line 547, in __call__…
```
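Judging from the call `triplet(p, labels)[0]`, the loss may be receiving (predictions, labels), whereas `torch.nn.TripletMarginLoss` expects three embedding tensors (anchor, positive, negative). A minimal sketch of the expected calling convention, assuming the stock PyTorch loss rather than the custom one in `train.py`:

```python
import torch
import torch.nn as nn

# torch's built-in triplet loss takes three embedding tensors,
# not a (predictions, labels) pair.
triplet = nn.TripletMarginLoss(margin=1.0, p=2)

anchor   = torch.tensor([[0.0, 0.0]])
positive = torch.tensor([[0.0, 0.0]])  # same point as the anchor
negative = torch.tensor([[0.5, 0.0]])  # 0.5 away from the anchor

loss = triplet(anchor, positive, negative)
# loss = max(d(a, p) - d(a, n) + margin, 0) ≈ max(0 - 0.5 + 1.0, 0) = 0.5
```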
-
Hi,
Could you please explain your triplet loss implementation a little? I implement triplet loss as follows, but I do not understand your implementation.
```python
class TripletLoss(nn.Module):
    """
    …
```
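For reference, a common PyTorch formulation of the module the snippet above starts (a sketch, not the implementation being asked about): hinge on the gap between the anchor-positive and anchor-negative Euclidean distances.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TripletLoss(nn.Module):
    """Standard margin-based triplet loss:
    mean(max(d(a, p) - d(a, n) + margin, 0))."""

    def __init__(self, margin: float = 1.0):
        super().__init__()
        self.margin = margin

    def forward(self, anchor, positive, negative):
        d_pos = F.pairwise_distance(anchor, positive, p=2)
        d_neg = F.pairwise_distance(anchor, negative, p=2)
        return F.relu(d_pos - d_neg + self.margin).mean()
```

With the default margin and Euclidean distance this matches `nn.TripletMarginLoss(margin=1.0, p=2)`.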
-
Hi~
Nice work! The proposed CC loss seems to narrow the modality gap, which is often done with a triplet loss. So, how about replacing CC with the triplet loss?
Thanks~
-
# Implementing Triplet Losses for Implicit Feedback Recommender Systems with R and Keras - Nan Xiao | 肖楠
[https://nanx.me/blog/post/triplet-loss-r-keras/](https://nanx.me/blog/post/triplet-loss-r-k…
-
Out of curiosity (and for a comparison to [Tevatron](https://github.com/texttron/tevatron/tree/main)) I tried running the [MSMarco MRNL](https://github.com/UKPLab/sentence-transformers/blob/master/exa…
-
Hello Dr. Zheng!
You mention in the paper that 50% of the images in each minibatch are used for negative mining, and that a metric learning method is used. Which parts of the code correspond to the negative mining and the triplet loss?
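For readers unfamiliar with the idea, in-batch negative mining for a triplet loss typically looks something like the following sketch (an illustration of the general technique, not the paper's actual code): for each anchor, pick the closest embedding in the batch that carries a different label.

```python
import torch

def hardest_in_batch_negatives(embeddings, labels):
    """For each row, return the index of the nearest embedding with a
    different label (hard negative). Illustrative sketch only."""
    dist = torch.cdist(embeddings, embeddings, p=2)    # (B, B) pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-label mask (incl. self)
    dist = dist.masked_fill(same, float("inf"))        # exclude positives and self
    return dist.argmin(dim=1)                          # index of hardest negative
```

The selected indices can then be used to assemble (anchor, positive, negative) triples for the triplet loss; mining only a fraction of the batch (e.g. 50%) is a matter of masking which anchors participate.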
-
Hi @davidsandberg! Thanks for your work on this repo!
When training with triplet loss using the recommended hyperparameters in the wiki, what kind of results were obtained? It'd be great if I could…