Open andreamorgar opened 2 years ago
I had the same doubt. I want to experiment with SBERT models and pass only a single text through the encoder, with fewer lines of code.
@andreamorgar No, you can choose other approaches, e.g. MSELoss: https://www.sbert.net/docs/package_reference/losses.html#mseloss
Can I train a sentence-transformer on data in the form [sentence1, label]? Would that be correct? I can't find anything in the documentation or on Hugging Face about this specific way of fine-tuning SBERT; everything relates to two sentences per training item!