SeanLee97 / AnglE

Train and Infer Powerful Sentence Embeddings with AnglE | 🔥 SOTA on STS and MTEB Leaderboard
https://arxiv.org/abs/2309.12871

Fine-tune LLM for WhereIsAI/UAE-Large-V1 embeddings first? #50

Open · sergiosolorzano opened this issue 4 months ago

sergiosolorzano commented 4 months ago

Hi,

To use the embeddings generated by WhereIsAI/UAE-Large-V1 with an LLM, do I first need to fine-tune a pre-trained LLM with AnglE so that the UAE-Large-V1 embeddings are compatible with that LLM? For example:

from angle_emb import AnglE

angle = AnglE.from_pretrained('NousResearch/Llama-2-7b-hf', pretrained_lora_path='SeanLee97/angle-llama-7b-nli-v2')
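For comparison, this is roughly how I encode sentences with the BERT-based WhereIsAI/UAE-Large-V1 checkpoint on its own, without any LLM backbone or LoRA weights (a minimal sketch based on my reading of the README; the pooling_strategy and to_numpy arguments are my assumptions about the angle_emb API):

# Minimal sketch: encode sentences directly with the UAE-Large-V1 checkpoint,
# no LLM fine-tuning involved. Arguments are assumed from the README usage.
from angle_emb import AnglE

angle = AnglE.from_pretrained('WhereIsAI/UAE-Large-V1', pooling_strategy='cls')
vecs = angle.encode(['hello world', 'another sentence'], to_numpy=True)
print(vecs.shape)  # I expect (2, 1024) for this BERT-large style encoder

Is this direct usage the intended path, or is the Llama-based fine-tune above required first?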

Thank you!