Hi there,

I'm a little confused with the terminology. I am using the model named 'all-mpnet-base-v2'.
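For context, here is roughly how I'm loading it through the sentence-transformers library (a minimal sketch; the example sentences are just placeholders):

```python
from sentence_transformers import SentenceTransformer

# 'all-mpnet-base-v2' is the checkpoint name I pass to the library
model = SentenceTransformer('all-mpnet-base-v2')

# encode() returns one embedding vector per input sentence
sentences = ["This is an example sentence.", "Each sentence gets a vector."]
embeddings = model.encode(sentences)
print(embeddings.shape)  # (2, 768) for this model
```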
When speaking to my team, is it fair to say:
"We are using S-BERT, a sentence transformer trained on over 1B sentence pairs."
Or
"We decided to go with all-mpnet-base-v2 instead of S-BERT. This is a sentence transformer based on the BERT architecture, trained on 1 billion sentence pairs, which is shown to have high performance on the S-BERT website."
Why can't I see a model named S-BERT anywhere?
Thanks,
Viraj