Yes, you can.
You can find some examples here: https://github.com/UKPLab/sentence-transformers/tree/master/examples/training/avg_word_embeddings
Specifically: https://github.com/UKPLab/sentence-transformers/blob/master/examples/training/avg_word_embeddings/training_stsbenchmark_cnn.py
from sentence_transformers import SentenceTransformer, models
word_embedding_model = models.Transformer('bert-base-uncased', max_seq_length=256)
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension())
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
In this demo, you put a pooling layer on top of word_embedding. Can I put some layers between the word_embedding and pooling layers?
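For illustration, the linked training_stsbenchmark_cnn.py does exactly that: it slots a models.CNN module between the token embeddings and the pooling layer. A minimal sketch of that composition (the embedding file and CNN hyperparameters here are illustrative, not necessarily the script's exact values):

from sentence_transformers import SentenceTransformer, models

# Token embeddings -> CNN -> pooling; the CNN transforms the per-token
# embeddings before they are pooled into a sentence embedding.
word_embedding_model = models.WordEmbeddings.from_text_file('glove.6B.300d.txt.gz')
cnn = models.CNN(
    in_word_embedding_dimension=word_embedding_model.get_word_embedding_dimension(),
    out_channels=256,
    kernel_sizes=[1, 3, 5],
)
# Pool over the CNN outputs, so the pooling dimension comes from the CNN
pooling_model = models.Pooling(cnn.get_word_embedding_dimension())
model = SentenceTransformer(modules=[word_embedding_model, cnn, pooling_model])

In general, any module that consumes and returns the token embeddings in the features dict can be inserted between the embedding and pooling modules in the same way.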