Open DoctorSlimm opened 4 months ago
Hiiii, was wondering what the default loss function is, and how to implement a custom one in the framework?

ps: i know this is totally random so apologies

Notably, I have been reading the [Atlas paper on fine-tuning retrievers](https://jmlr.org/papers/volume24/23-0037/23-0037.pdf) and the LOOL loss function is interesting.

---

Hello!

Sentence Transformers doesn't have a "default" loss function per se; users always have to specify one. The current options are all stored here: https://github.com/UKPLab/sentence-transformers/tree/master/sentence_transformers/losses

Implementing a custom loss involves creating a class that:

- subclasses `torch.nn.Module`,
- accepts a `model` parameter as the first argument (this is the Sentence Transformer model), optionally followed by more parameters,
- implements a `forward` method with the `self, sentence_features: Iterable[Dict[str, Tensor]], labels: Tensor` signature. `sentence_features` is a list of dictionaries containing whatever the model requires (usually `"input_ids"` and `"attention_mask"`). You can look at the other losses for examples of how this is used to get the embeddings.
- The `forward` method must return a loss value. It must be a `Tensor` so that the model can learn using backpropagation.
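A minimal sketch of the custom-loss recipe described above. The class and parameter names here are hypothetical illustrations (only the `model`-first constructor and the `forward` signature come from the recipe), and the toy model at the bottom is a stand-in for a real `SentenceTransformer`, which is assumed to return a dict containing a `"sentence_embedding"` key:

```python
from typing import Dict, Iterable

import torch
from torch import Tensor, nn


class CosineMSELoss(nn.Module):
    """Illustrative custom loss: MSE between the cosine similarity of
    two sentence embeddings and a gold similarity label."""

    def __init__(self, model: nn.Module, scale: float = 1.0):
        super().__init__()
        self.model = model  # the Sentence Transformer model, passed first
        self.scale = scale  # extra parameters may follow `model`

    def forward(self, sentence_features: Iterable[Dict[str, Tensor]], labels: Tensor) -> Tensor:
        # Each entry in `sentence_features` holds the tokenized inputs for
        # one text column; run the model on each to get the embeddings.
        embeddings = [self.model(features)["sentence_embedding"] for features in sentence_features]
        cos_sim = torch.cosine_similarity(embeddings[0], embeddings[1], dim=-1)
        # Return a scalar Tensor so the model can learn via backpropagation.
        return nn.functional.mse_loss(cos_sim * self.scale, labels.float())


# Toy stand-in for a SentenceTransformer, just to exercise the loss:
# it maps "input_ids" to mean-pooled embeddings.
class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(100, 8)

    def forward(self, features: Dict[str, Tensor]) -> Dict[str, Tensor]:
        return {"sentence_embedding": self.embed(features["input_ids"]).mean(dim=1)}


loss_fn = CosineMSELoss(ToyModel())
sentence_features = [{"input_ids": torch.randint(0, 100, (4, 6))} for _ in range(2)]
loss = loss_fn(sentence_features, torch.rand(4))
loss.backward()  # the returned Tensor supports backprop
```

In real training you would pass a `SentenceTransformer` instance instead of `ToyModel`, and the trainer would supply `sentence_features` and `labels` to `forward` for you.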