Closed renesteeman closed 8 months ago
Hi @renesteeman. I'm not sure I fully understand the question, so let me try to rephrase it.
You have a potential head implemented in TF, and this head would not currently work with SetFit, as SetFit expects either a scikit-learn or a PyTorch head. However, you would still like to use the embeddings from the sentence-transformer trained with SetFit and pass them to the TF head. Is that right?
Technically, the sentence-transformer part can be trained separately from the head in SetFit. So if you train a SetFit model on your data with a scikit-learn head (the default option, as far as I remember), you can then use the .encode() method after training to embed your dataset and pass the embeddings to your TF head.
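To illustrate the decoupling described above, here is a minimal sketch. The `embed()` function below is a hypothetical stand-in for the trained sentence-transformer's encode step (in SetFit this would be something like calling the model body's `.encode()` on your sentences); the downstream head is shown as a tiny numpy logistic-regression loop only to emphasize that any head, including a custom TF one, can consume the embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(sentences):
    # Hypothetical placeholder for the fine-tuned sentence-transformer's
    # encode step; returns one fixed-size embedding vector per sentence.
    return rng.normal(size=(len(sentences), 16))

sentences = [f"example sentence {i}" for i in range(8)]
labels = np.array([0, 1] * 4)

# Step 1: embed the dataset once with the (fine-tuned) body.
X = embed(sentences)  # shape: (n_samples, embedding_dim)

# Step 2: train any head on the embeddings -- scikit-learn, PyTorch,
# or a custom TF model. Here: plain gradient descent on a logistic head.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
    grad = p - labels                        # gradient of log loss
    w -= 0.1 * (X.T @ grad) / len(X)
    b -= 0.1 * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
print(X.shape, preds.shape)
```

The key point is that step 1 and step 2 communicate only through the embedding matrix `X`, so the head never needs to satisfy SetFit's scikit-learn/PyTorch requirement.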
Hi @kgourgou. That is exactly what I meant, thank you!
I have been using embedding models to create the inputs for a highly customized TF-based classification model that does not fit the current SetFit requirements. Would there be a way to use the fine-tuned embeddings without providing a classification head?