xlang-ai / instructor-embedding

[ACL 2023] One Embedder, Any Task: Instruction-Finetuned Text Embeddings
Apache License 2.0

Hard-coded max sequence length #27

Closed · silverstar0727 closed this issue 1 year ago

silverstar0727 commented 1 year ago

Due to this line, max_seq_length stays fixed at 512 even if I run the code below:

`model.max_seq_length = 1024`

Is this because max_seq_length cannot be changed? If not, it would be better to remove that line.

silverstar0727 commented 1 year ago

Oh... this line tricked me into believing that max_seq_length was 512.

I got 1024 when I ran `print(model.max_seq_length)`, so the override actually works.

Harry-hash commented 1 year ago

Yes, the line you mentioned only sets the value at initialization; you can change the maximum sequence length by overriding it after loading the model.
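
For reference, a minimal sketch of the override described above, assuming the InstructorEmbedding package is installed; the hkunlp/instructor-large checkpoint and the example instruction/text pair are illustrative, not taken from this thread:

```python
from InstructorEmbedding import INSTRUCTOR

# Load the model; the checkpoint name here is just an example.
model = INSTRUCTOR("hkunlp/instructor-large")
print(model.max_seq_length)  # value set at initialization (512)

# Override the maximum sequence length after loading.
model.max_seq_length = 1024
print(model.max_seq_length)  # now 1024; longer inputs are truncated at this length

# Encode an [instruction, text] pair as usual.
embeddings = model.encode(
    [["Represent the science document for retrieval:",
      "A long passage whose token count exceeds the default 512-token limit ..."]]
)
print(embeddings.shape)
```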