UKPLab / sentence-transformers

State-of-the-Art Text Embeddings
https://www.sbert.net
Apache License 2.0

which objective function should i use #836

Open sixmilesroad opened 3 years ago

sixmilesroad commented 3 years ago

In the paper you present three objective functions: 1. Classification Objective Function, 2. Regression Objective Function, 3. Triplet Objective Function. My data format is: sentence1, sentence2, similarity score. Which objective function should I use? Also, in the paper you evaluate SBERT embeddings on SentEval — which objective function did you use for that test? If you used the Classification Objective Function, does that mean embeddings trained with the Regression Objective Function cannot perform well on SentEval?

nreimers commented 3 years ago

Have a look at our documentation: https://www.sbert.net/docs/package_reference/losses.html

For your task the cosine similarity loss sounds right.

The loss function is completely independent of SentEval.
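To make the recommendation concrete, here is a minimal sketch of what CosineSimilarityLoss optimizes: the cosine similarity of the two sentence embeddings is regressed against the gold similarity score with a mean-squared error. The toy embeddings and helper names below are my own illustration, not the library's API.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def cosine_similarity_loss(pairs):
    """Mean squared error between cosine(emb1, emb2) and the gold score.

    pairs: list of (embedding1, embedding2, gold_score), gold_score in [0, 1].
    """
    return sum((cosine(u, v) - s) ** 2 for u, v, s in pairs) / len(pairs)

# A pair of identical embeddings with gold score 1.0 incurs zero loss.
loss = cosine_similarity_loss([([1.0, 2.0], [1.0, 2.0], 1.0)])
```

In the actual library, `losses.CosineSimilarityLoss(model)` does the same thing on the embeddings the model produces for each `InputExample(texts=[s1, s2], label=score)`.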

sixmilesroad commented 3 years ago

If the data format I have is: sent1, sent2, label (where the label can be 0 or 1), I want to know which of SoftmaxLoss and ContrastiveLoss I should try.

nreimers commented 3 years ago

ContrastiveLoss seems more intuitive.

Check this example: https://github.com/UKPLab/sentence-transformers/tree/master/examples/training/quora_duplicate_questions
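For intuition, here is a stdlib-only sketch of the contrastive loss formulation for binary-labeled pairs: similar pairs (label 1) are penalized by their distance, and dissimilar pairs (label 0) are penalized only when they fall inside a margin. The function names and the cosine-distance choice are assumptions for illustration; the library's `losses.ContrastiveLoss` takes the model plus optional distance metric and margin.

```python
import math

def cosine_distance(u, v):
    """Cosine distance = 1 - cosine similarity."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / norm

def contrastive_loss(u, v, label, margin=0.5):
    """Contrastive loss for one pair of embeddings.

    label 1 (similar pair):   pull embeddings together (penalize distance).
    label 0 (dissimilar pair): push apart, but only while closer than margin.
    """
    d = cosine_distance(u, v)
    if label == 1:
        return 0.5 * d ** 2
    return 0.5 * max(margin - d, 0.0) ** 2

# Identical embeddings labeled similar: zero loss.
# Orthogonal embeddings labeled dissimilar (distance 1.0 > margin): also zero.
```

This is why it fits sent1/sent2/0-or-1 data directly, whereas SoftmaxLoss trains a classification head on top of the embeddings and only shapes the embedding space indirectly.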

sixmilesroad commented 3 years ago

I want to know how to choose the loss function. Does it depend on the data format, or on the task the generated embeddings will be used for?

sixmilesroad commented 3 years ago

In the SentEval part of your paper, which loss function did you use for the MRPC dataset?