-
We need a fast way to find documents related to a given user query. I suggest a combination of:
- a TF-IDF vector space model, or the improved BM25 ranking function, for speed
- Sentence BERT to embed sentences of e…
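As a concrete sketch of the first stage, here is a tiny TF-IDF vector space ranker in pure Python; the corpus and whitespace tokenizer are placeholder assumptions, and in a full pipeline the top-k hits would then be re-scored with Sentence BERT embeddings.

```python
# Sketch of first-stage retrieval with a TF-IDF vector space model.
# Corpus and tokenization are illustrative placeholders.
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Build a sparse TF-IDF vector (term -> weight dict) per tokenized doc."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    idf = {t: math.log(n / df[t]) for t in df}
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: tf[t] * idf[t] for t in tf})
    return vectors, idf

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

corpus = [
    "neural retrieval with dense embeddings".split(),
    "bm25 is a strong sparse baseline".split(),
    "cooking pasta at home".split(),
]
doc_vecs, idf = tf_idf_vectors(corpus)

query = "dense neural retrieval".split()
q_tf = Counter(query)
q_vec = {t: q_tf[t] * idf.get(t, 0.0) for t in query}

# Rank documents by cosine similarity to the query.
ranked = sorted(range(len(corpus)),
                key=lambda i: cosine(q_vec, doc_vecs[i]), reverse=True)
print(ranked[0])  # -> 0: the retrieval document matches best
```

BM25 replaces the raw TF-IDF weight with a saturating term-frequency term and document-length normalization, but the retrieve-then-rank structure stays the same.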
-
I'm trying to run beir/examples/retrieval/evaluation/dense/evaluate_sbert_multi_gpu.py. Doing so, I end up with the error below.
Traceback (most recent call last):
File "evaluate_sbert_multi_gpu.…
-
Hello, thanks for the code. I cannot find in the paper which SBERT model was used. Also, could you clarify how to use it in the training pipeline?
-
As specified in this issue: https://github.com/UKPLab/sentence-transformers/issues/350#issuecomment-757687915
The authors of SBERT recommend using CrossEncoders for sentence classification.
-
I have trained an SBERT model from scratch using [training_nli.py](https://github.com/UKPLab/sentence-transformers/blob/master/examples/training_transformers/training_nli.py) and [https://github.com/…
ghost updated
1 month ago
-
Sentence Transformers is one of the easiest libraries for training models on STS and other tasks. Since people would consequently use SBERT with BEIR, given the authors' involvement in both, it wou…
-
## Summary:
Message passing and feature aggregation are effective techniques for improving the quality of topic clusters in a graph-based topic modeling system. Message passing involves propagating …
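To make the two techniques concrete, here is a minimal sketch of one round of message passing with mean feature aggregation over a toy document graph; the graph, feature vectors, and aggregation rule are illustrative assumptions, not details from the summary above.

```python
# Sketch: one round of message passing with mean aggregation.
# Graph and features are toy placeholders.

def message_passing_round(graph, features):
    """Each node's new feature is the element-wise mean of its own
    feature vector and those of its neighbors."""
    updated = {}
    for node, neighbors in graph.items():
        vectors = [features[node]] + [features[n] for n in neighbors]
        dim = len(features[node])
        updated[node] = [
            sum(vec[i] for vec in vectors) / len(vectors) for i in range(dim)
        ]
    return updated

# Toy graph: documents A - B - C in a chain.
graph = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
features = {"A": [1.0, 0.0], "B": [0.0, 1.0], "C": [1.0, 0.0]}

smoothed = message_passing_round(graph, features)
print(smoothed["B"])  # B averages A, B, C -> [2/3, 1/3]
```

Repeating this round smooths features across connected documents, which is what pulls topically related nodes toward a shared cluster representation.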
-
https://arxiv.org/abs/1908.10084
https://www.sbert.net/
-
In the [SBERT repository](https://www.sbert.net/examples/training/adaptive_layer/README.html), I found the adaptive layers method referenced in this paper: [_**ESE**: Espresso Sentence Embeddings_](ht…
-
Hey,
I have a question regarding the pre-trained models listed on https://www.sbert.net/docs/pretrained_models.html:
Are these models still based on your paper "Sentence-BERT: Sentence Embeddings…