UKPLab / sentence-transformers

State-of-the-Art Text Embeddings
https://www.sbert.net
Apache License 2.0

multi-task-learning #2824

Open IcanDoItL opened 3 months ago

IcanDoItL commented 3 months ago

In the example examples/training/quora_duplicate_questions/training_multi-task-learning.py, ContrastiveLoss and MultipleNegativesRankingLoss do not produce loss values of the same magnitude. How can the multi-task training still be effective with this mismatch? Is there any reference material available?
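
For reference, this is roughly the setup I mean, written as a minimal sketch against the SentenceTransformerTrainer API; the dataset names and example rows below are placeholders, not the actual Quora data from the script:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import ContrastiveLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("distilbert-base-uncased")

# Pair-classification data for ContrastiveLoss: (sentence1, sentence2, label in {0, 1})
pair_class_dataset = Dataset.from_dict({
    "sentence1": ["How do I learn Python?", "What is the capital of France?"],
    "sentence2": ["How can I learn Python?", "Who wrote Hamlet?"],
    "label": [1, 0],
})

# Positive-pair data for MultipleNegativesRankingLoss: (anchor, positive)
mnrl_dataset = Dataset.from_dict({
    "anchor": ["How do I learn Python?"],
    "positive": ["How can I learn Python?"],
})

train_dataset = {
    "pair-class": pair_class_dataset,
    "pair-pos": mnrl_dataset,
}

# One loss per dataset: each batch comes from a single dataset and is scored
# only by that dataset's loss, even though the two losses differ in magnitude.
losses = {
    "pair-class": ContrastiveLoss(model),
    "pair-pos": MultipleNegativesRankingLoss(model),
}

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_dataset,
    loss=losses,
)
trainer.train()
```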

IcanDoItL commented 3 months ago

In the same example, examples/training/quora_duplicate_questions/training_multi-task-learning.py, ContrastiveLoss is also trained with BatchSamplers.NO_DUPLICATES. Will that affect its effectiveness?
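
This is where the sampler is set in my sketch above (again an assumption on my side; the output directory and batch size are placeholders):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

# BatchSamplers.NO_DUPLICATES avoids duplicate texts within a batch, which is
# intended for losses that use in-batch negatives such as
# MultipleNegativesRankingLoss; here it is applied to every dataset's batches,
# including the ContrastiveLoss one.
args = SentenceTransformerTrainingArguments(
    output_dir="output/quora-multi-task",        # placeholder path
    per_device_train_batch_size=64,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```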