microsoft / unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI

How to fine-tune E5-mistral-7b-instruct? #1582

Open · Zheng-Jay opened this issue 4 months ago

Zheng-Jay commented 4 months ago

Thank you for your team's contribution! I would like to fine-tune E5-mistral-7b-instruct for tasks that interest me. Do you have plans to open-source the training code? Alternatively, is there similar code available for reference and learning?
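
For anyone looking for a starting point while official fine-tuning code is unavailable: decoder-based embedding models in the E5-mistral-7b-instruct family are commonly fine-tuned with a contrastive (InfoNCE) objective over query/passage pairs, pooling the hidden state of the last non-padding token. The sketch below illustrates that setup only; the checkpoint id, instruction prefix, temperature, toy data, and single-step loop are assumptions for illustration, not the authors' recipe.

```python
# Hypothetical minimal sketch: contrastive (InfoNCE) fine-tuning of a decoder-style
# embedding model with last-token pooling. Data, hyperparameters, and the one-step
# loop are placeholders, not an official recipe.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_name = "intfloat/e5-mistral-7b-instruct"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"  # so the last real token is at index (length - 1)
model = AutoModel.from_pretrained(model_name, torch_dtype=torch.bfloat16)
model.train()

def last_token_pool(hidden_states, attention_mask):
    # Hidden state of the last non-padding token for each sequence.
    seq_lens = attention_mask.sum(dim=1) - 1
    return hidden_states[torch.arange(hidden_states.size(0)), seq_lens]

def encode(texts, max_length=512):
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=max_length, return_tensors="pt")
    out = model(**batch)
    emb = last_token_pool(out.last_hidden_state, batch["attention_mask"])
    return F.normalize(emb, p=2, dim=-1)

# Toy (query, positive passage) pairs; real training data is task-specific.
pairs = [
    ("Instruct: Given a web search query, retrieve relevant passages.\nQuery: how to bake bread",
     "Mix flour, water, yeast and salt, knead, proof, then bake at 230C."),
    ("Instruct: Given a web search query, retrieve relevant passages.\nQuery: what is InfoNCE",
     "InfoNCE is a contrastive loss that scores a positive pair against in-batch negatives."),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
temperature = 0.02  # assumed value

q_emb = encode([q for q, _ in pairs])
p_emb = encode([p for _, p in pairs])

# InfoNCE with in-batch negatives: each query's positive passage is the diagonal entry.
scores = q_emb @ p_emb.T / temperature
labels = torch.arange(scores.size(0))
loss = F.cross_entropy(scores, labels)
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(f"contrastive loss: {loss.item():.4f}")
```

In practice this would be wrapped in a proper DataLoader with mined hard negatives, gradient accumulation, and, for a 7B model, LoRA or another parameter-efficient method rather than full fine-tuning.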

yuetan1988 commented 3 months ago

I also ran some experiments to fine-tune the model, and I referred to these: