Open austinmw opened 9 months ago
Hi, the loss function is at https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/finetune/modeling.py#L98
Yes, we use CrossEntropyLoss with in-batch negative sampling. We always use the provided negatives, and if use_inbatch_neg
is set to True (the default), we also use the passages from other queries in the batch as negatives.
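For intuition, the in-batch negative objective can be sketched roughly as below. This is a minimal illustration, not the repo's exact code; the function name, the temperature value, and the assumption that each query's positive sits at index `i * group_size` are all illustrative.

```python
import torch
import torch.nn.functional as F

def in_batch_negative_loss(q_reps, p_reps, temperature=0.05):
    """Cross-entropy over similarity scores, treating every other
    passage in the batch as a negative for each query.

    q_reps: (batch, dim) query embeddings
    p_reps: (batch * group_size, dim) passage embeddings, where group_size
            = 1 positive + n hard negatives per query; the positive for
            query i is assumed to sit at index i * group_size.
    """
    scores = q_reps @ p_reps.T / temperature            # (batch, batch * group_size)
    group_size = p_reps.size(0) // q_reps.size(0)
    target = torch.arange(q_reps.size(0)) * group_size  # index of each positive
    return F.cross_entropy(scores, target)

# toy example: 2 queries, each with 1 positive + 1 hard negative
q = F.normalize(torch.randn(2, 8), dim=-1)
p = F.normalize(torch.randn(4, 8), dim=-1)
loss = in_batch_negative_loss(q, p)
```

The key point is that the cross-entropy target is just the column index of each query's positive, so every other column of the score matrix (hard negatives plus other queries' passages) acts as a negative class.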
How can I change the loss function? Could you show me?
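One hypothetical way to swap the objective, assuming the model computes its loss in a method like the one in modeling.py linked above: subclass the model and override that method. The class and method names here are illustrative stand-ins, not the repo's actual API.

```python
import torch
import torch.nn.functional as F

class BaseModel:  # stand-in for the repo's bi-encoder model class
    def compute_loss(self, scores, target):
        # default objective: cross entropy over in-batch scores
        return F.cross_entropy(scores, target)

class MyModel(BaseModel):
    def compute_loss(self, scores, target):
        # example replacement: a multi-class margin loss instead
        return F.multi_margin_loss(scores, target)

# usage: scores has shape (batch, num_candidates), target holds
# the index of each query's positive candidate
scores = torch.randn(4, 8)
target = torch.tensor([0, 2, 4, 6])
loss = MyModel().compute_loss(scores, target)
```

Any criterion that maps a (batch, num_candidates) score matrix and a target index per row to a scalar can be dropped in this way.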
Hi, what is the name of the loss function used for fine-tuning? And could you please point me to where it is defined?
Edit: looks like CrossEntropyLoss with in-batch negative sampling? What happens if no negatives are provided — does it just treat all other examples in the batch as negatives anyway, resulting in worse performance? And if negatives are provided, will it only construct batches with no context overlap?