UKPLab / sentence-transformers

State-of-the-Art Text Embeddings
https://www.sbert.net
Apache License 2.0

about model #669

Closed sixmilesroad closed 3 years ago

sixmilesroad commented 3 years ago

from sentence_transformers import SentenceTransformer, models

word_embedding_model = models.Transformer('bert-base-uncased', max_seq_length=256)
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension())

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

In this demo, you put a pooling layer on top of the word embeddings. Can I put some layers between the word embedding and the pooling layer?

nreimers commented 3 years ago

Yes, you can.

You can find some examples here: https://github.com/UKPLab/sentence-transformers/tree/master/examples/training/avg_word_embeddings

Specifically: https://github.com/UKPLab/sentence-transformers/blob/master/examples/training/avg_word_embeddings/training_stsbenchmark_cnn.py
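For reference, a minimal sketch of that pattern adapted to the BERT setup from the question above. It uses the models.CNN module from the linked script as the intermediate layer; any module that maps token embeddings to token embeddings can be slotted in the same way. (The exact module list and dimensions here are an illustration, not the only valid configuration.)

from sentence_transformers import SentenceTransformer, models

# Token embeddings from BERT, as in the snippet above
word_embedding_model = models.Transformer('bert-base-uncased', max_seq_length=256)

# Extra layer between the token embeddings and the pooling:
# a multi-kernel CNN over token embeddings, as in the linked
# training_stsbenchmark_cnn.py example
cnn = models.CNN(
    in_word_embedding_dimension=word_embedding_model.get_word_embedding_dimension(),
    out_channels=256,
    kernel_sizes=[1, 3, 5],
)

# Pooling now operates on the CNN output dimension,
# not on the raw BERT hidden size
pooling_model = models.Pooling(cnn.get_word_embedding_dimension())

model = SentenceTransformer(modules=[word_embedding_model, cnn, pooling_model])

The key point is that each module's output dimension must match what the next module expects, which is why the pooling layer is constructed from cnn.get_word_embedding_dimension() rather than from the transformer's dimension.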