UKPLab / sentence-transformers

State-of-the-Art Text Embeddings
https://www.sbert.net
Apache License 2.0

Why is "max_position_embeddings" 514 in sbert whereas 512 in BERT #1562

Open omerarshad opened 2 years ago

omerarshad commented 2 years ago

Why is "max_position_embeddings" different in sbert than in BERT?

nreimers commented 2 years ago

Can you point to the respective code?

Some models allow 514 tokens to account for special tokens
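For context, a common case is RoBERTa-style checkpoints (which several sentence-transformers models are based on): position ids start at `padding_idx + 1` rather than 0, so the position embedding table needs `padding_idx + 1` extra rows on top of the 512 usable positions. A minimal sketch of that arithmetic, assuming the RoBERTa convention of `padding_idx = 1` (this is an illustration of one model family, not all models with 514):

```python
# RoBERTa-style position ids are offset: the first real position id
# is padding_idx + 1, and position id 0..padding_idx are reserved.
pad_token_id = 1                 # RoBERTa's padding token id (assumption: RoBERTa convention)
max_position_embeddings = 514    # size of the position embedding table in the config

# Usable sequence length = table size minus the reserved leading rows
usable_positions = max_position_embeddings - pad_token_id - 1
print(usable_positions)  # 512
```

So a config reporting 514 can still mean a 512-token maximum sequence length, which is why sentence-transformers exposes a separate `max_seq_length` on the model.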