FlagOpen / FlagEmbedding

Retrieval and Retrieval-augmented LLMs
MIT License

How to finetune a Bi-encoder with lora method? #835

Open bg51717 opened 4 months ago

bg51717 commented 4 months ago

I need to fine-tune an embedding model that is a bi-encoder. I hope to reduce storage costs when saving checkpoints, so I want to use LoRA. However, I can't find any LoRA arguments in FlagEmbedding/baai_general_embedding/finetune/arguments.py. How can I use the LoRA method? Thanks for your great work.

staoxiao commented 4 months ago

Sorry, the current code doesn't support fine-tuning with LoRA.
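
For context, the storage saving the question asks about comes from LoRA's low-rank decomposition: a frozen weight `W` is adapted as `W + (alpha/r) * B @ A`, and only the small matrices `A` and `B` need to be written to a checkpoint. Below is a minimal NumPy sketch of that idea (the `LoRALinear` class and its names are hypothetical, not FlagEmbedding code):

```python
import numpy as np

class LoRALinear:
    """Hypothetical illustration of a LoRA-adapted linear layer."""

    def __init__(self, weight, r=8, alpha=16, seed=0):
        rng = np.random.default_rng(seed)
        self.weight = weight                  # frozen base weight, shape (out, in)
        out_dim, in_dim = weight.shape
        self.A = rng.normal(0, 0.01, size=(r, in_dim))  # trainable low-rank factor
        self.B = np.zeros((out_dim, r))                 # trainable, zero-initialized
        self.scaling = alpha / r

    def forward(self, x):
        # Base projection plus the scaled low-rank update B @ A.
        return x @ self.weight.T + (x @ self.A.T) @ self.B.T * self.scaling

    def trainable_params(self):
        # Only A and B would be saved in a LoRA checkpoint.
        return self.A.size + self.B.size

W = np.random.default_rng(1).normal(size=(768, 768))
layer = LoRALinear(W)
x = np.ones((1, 768))
# With r=8, A and B together hold ~2% of the parameters in the full 768x768 weight.
print(layer.trainable_params(), W.size)
```

Because `B` starts at zero, the adapted layer initially matches the frozen base model, and training moves only `A` and `B`. In practice one would reach for a library such as HuggingFace PEFT rather than hand-rolling this, but the current FlagEmbedding fine-tuning scripts do not expose such an option.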