-
Currently, some examples in our dataset are not fully correct. We should identify the problematic samples and update them.
I suggest storing the examples in this issue.
@pasukka
-
I'm trying to run beir/examples/retrieval/evaluation/dense/evaluate_sbert_multi_gpu.py. Doing so, I end up with the error below.
Traceback (most recent call last):
File "evaluate_sbert_multi_gpu.…
-
Hi,
I am trying to use the "gpt2" Hugging Face model with the SBERT code, but it raises a NoneType error in the get_sentence_features function.
word_embedding_model = models.Transformer('gpt2')
I read a…
-
Hello, and thanks for the code. I cannot find the detail in the paper about which SBERT model was used. Also, could you clarify how to use it in the training pipeline?
-
Did you try SimCSE's supervised training objective in-domain on USEB?
It would be interesting to compare it to SBERT-supervised!
-
Hi,
I obtained results on the STS Benchmark without training SBERT.
The SBERT model consists of a BERT layer plus a pooling layer, and I evaluated it using SentEval.
I ran the same evaluation on USE. Below are the results…
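For context, the "BERT layer + pooling layer" setup described above typically means mean pooling over the transformer's token embeddings. A minimal NumPy sketch of that pooling step (toy values; the real model pools contextual embeddings produced by BERT):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Mean-pool token embeddings, ignoring padding positions.

    token_embeddings: (seq_len, dim) array of per-token vectors
    attention_mask:   (seq_len,) array of 1s (real tokens) and 0s (padding)
    """
    mask = attention_mask[:, None].astype(float)    # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)  # sum over real tokens only
    count = mask.sum()                              # number of real tokens
    return summed / count

# Toy example: 3 tokens (last one is padding), embedding dim = 2
emb = np.array([[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]])
mask = np.array([1, 1, 0])
print(mean_pool(emb, mask))  # → [2. 3.]
```

The padding token is excluded from the average, so the sentence vector depends only on real tokens.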
-
I'm looking to quantize the embeddings to speed up indexing, searching, etc.
For example in `sentence_transformers` there is [quantize_embeddings](https://sbert.net/docs/package_r…
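To illustrate the idea behind such APIs (this is a generic sketch of symmetric scalar int8 quantization, not `sentence_transformers`' actual implementation):

```python
import numpy as np

def int8_quantize(embeddings):
    """Symmetric scalar int8 quantization.

    Maps each dimension's float range to [-127, 127].
    embeddings: (n, dim) float array.
    Returns (int8 array, per-dimension scales) for later dequantization.
    """
    scales = np.abs(embeddings).max(axis=0)  # per-dimension max |value|
    scales[scales == 0] = 1.0                # avoid division by zero
    q = np.round(embeddings / scales * 127).astype(np.int8)
    return q, scales

def dequantize(q, scales):
    """Approximately recover the original floats."""
    return q.astype(np.float32) * scales / 127

emb = np.random.RandomState(0).randn(4, 8).astype(np.float32)
q, scales = int8_quantize(emb)
approx = dequantize(q, scales)
print(np.abs(emb - approx).max())  # small reconstruction error
```

int8 storage is 4x smaller than float32, and many vector indexes can search directly over the quantized vectors at some cost in accuracy.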
-
[yunjinchoidev]
![232969785-a39a59d7-077a-4e0b-8d26-d5194259df72](https://user-images.githubusercontent.com/89494907/233949591-59c2e38a-5562-412a-ab94-83f970fca1cd.png)
![232970200-9bbecfb8-833d…
-
How could SBERT (before fine-tuning) work without supervision? I don't quite understand section 4 of the paper.
I mean, is it fair to compare BERT without fine-tuning and SBERT after fi…
-
Hey,
I have a question regarding the pre-trained models listed on https://www.sbert.net/docs/pretrained_models.html:
Are these models still based on your paper "Sentence-BERT: Sentence Embeddings…