matallanas opened 9 months ago
Hello!
I'm afraid that intfloat/e5-mistral-7b-instruct is not currently compatible with Sentence Transformers due to its left-padding tokenizer.
I also would not generally recommend a model of this size for classification, especially because it will take a lot of memory to fine-tune. A good alternative and strong model is BAAI/bge-large-en-v1.5.
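To make the suggestion above concrete, here is a minimal sketch of the embed-then-classify approach: encode each text with a frozen embedding model, then train a lightweight classifier on the vectors. The random vectors below stand in for real `SentenceTransformer("BAAI/bge-large-en-v1.5").encode(texts)` output so the sketch runs without downloading the model; the logistic-regression head, the 1024-d embedding size, and the intent counts are illustrative assumptions, not a tested recipe from this thread.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Stand-in for frozen sentence embeddings. In practice you would do:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("BAAI/bge-large-en-v1.5")
#   X = model.encode(train_texts)            # shape (n_texts, 1024)
rng = np.random.default_rng(0)
n_train, dim, n_intents = 200, 1024, 4       # bge-large embeddings are 1024-d
X = rng.normal(size=(n_train, dim))          # placeholder for model.encode(train_texts)
y = rng.integers(0, n_intents, size=n_train) # placeholder intent labels

# Train a small classifier head on top of the frozen embeddings;
# only this head is trained, so GPU memory is not a concern.
clf = LogisticRegression(max_iter=1000).fit(X, y)
pred = clf.predict(X[:5])                    # predicted intent ids for 5 texts
```

Because the embedding model is only used for inference, this avoids the fine-tuning memory cost mentioned above; swapping in a different encoder only changes `dim` and the `encode` call.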
My problem is that I wanted to build an intent classifier for prompts from a chat. The length of each text is variable, sometimes more than 512 tokens, and the texts are in different languages. That was my reason for using intfloat/e5-mistral-7b-instruct. Is there any other model you can recommend for this problem? Thanks again.
I wanted to ask if anyone has used intfloat/e5-mistral-7b-instruct as a base model for a multiclass classification task. I am trying to use it, but I have a problem with GPU memory, and I don't know if anyone has been able to use it properly. Thank you so much in advance.