I need to finetune an embedding model (a bi-encoder). To reduce the storage cost of saving checkpoints, I want to use LoRA, but I can't find any LoRA arguments in FlagEmbedding/baai_general_embedding/finetune/arguments.py. How can I use the LoRA method? Thanks for your great work.
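For context on why LoRA shrinks checkpoints: instead of saving a full d_out × d_in weight update for each adapted layer, LoRA stores two low-rank factors, B (d_out × r) and A (r × d_in), with r much smaller than the hidden size. The sketch below (plain NumPy, not FlagEmbedding code) illustrates the math and the storage saving; the dimensions and rank are illustrative assumptions:

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=16, r=8):
    """LoRA forward pass: y = x @ (W + (alpha/r) * B @ A).T.
    W is the frozen pretrained weight; only A and B are trained."""
    delta = (alpha / r) * (B @ A)       # rank-r update to W
    return x @ (W + delta).T

d_in, d_out, r = 768, 768, 8            # BERT-size hidden dim, small LoRA rank
rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable factor A (random init)
B = np.zeros((d_out, r))                # trainable factor B (zero init, so the
                                        # adapted model starts identical to W)

x = rng.normal(size=(4, d_in))          # a batch of 4 embeddings
y = lora_forward(x, W, A, B)

full_params = W.size                    # what a full checkpoint stores: 589824
lora_params = A.size + B.size           # what a LoRA checkpoint stores: 12288
print(full_params, lora_params)         # ~48x smaller per adapted layer
```

In practice, since arguments.py exposes no LoRA options, one workaround (an assumption about your setup, not an official FlagEmbedding feature) is to wrap the encoder with the `peft` library before training, e.g. `get_peft_model(model, LoraConfig(r=8, target_modules=["query", "value"]))`, and call `save_pretrained` on the wrapped model, which writes only the small adapter weights rather than the full model.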