dwzhu-pku / LongEmbed

LongEmbed: Extending Embedding Models for Long Context Retrieval (EMNLP 2024)

Adding RoPE modifications to BERT/sentence-transformer models #5

Open sandeep-krutrim opened 1 month ago

sandeep-krutrim commented 1 month ago

Hi,

I want to take a sentence-transformer model (say, XLM-R) and extend its context length using RoPE. How can I do this? Could you provide code for it?

dwzhu-pku commented 1 month ago

Hi @sandeep-krutrim , thanks for your interest in our work!

To apply the context extension strategies designed for RoPE, you first need a model that uses RoPE as its positional encoding. Models like XLM-R use learned absolute position embeddings instead, so there is no straightforward way to apply these strategies to them, unless you convert the positional encoding to RoPE and retrain the model from scratch :-)
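
To illustrate the distinction, here is a minimal sketch using Hugging Face transformers (this is not LongEmbed's own code). For RoPE-based models, a `rope_scaling` config option lets you apply strategies like linear position interpolation at load time; the RoPE model name below is a placeholder assumption:

```python
from transformers import AutoModel

# RoPE-based models (e.g., the LLaMA family) accept a `rope_scaling` config
# option, so context extension via linear position interpolation can be
# applied when loading. "some-org/rope-based-model" is a placeholder.
rope_model = AutoModel.from_pretrained(
    "some-org/rope-based-model",
    rope_scaling={"type": "linear", "factor": 4.0},  # ~4x the original context window
)

# XLM-R, by contrast, stores positions in a learned lookup table:
xlmr = AutoModel.from_pretrained("xlm-roberta-base")
print(type(xlmr.embeddings.position_embeddings))  # <class 'torch.nn.modules.sparse.Embedding'>
# There are no rotary frequencies to rescale here; extending the context
# would mean enlarging this embedding matrix and training the new rows.
```

In other words, RoPE encodes positions by rotating query/key vectors with fixed frequencies that can be rescaled without retraining, while a learned `nn.Embedding` only has rows for the positions seen during pretraining.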