Open panteaHK opened 3 weeks ago
You can refer to the bge-en-icl fine-tuning example.
I looked at the examples, but it is still unclear to me how to set up fine-tuning for MLM or contrastive learning.
The bge-en-icl model does not require pre-training; it can be fine-tuned directly.
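For the contrastive-learning part of the question, embedding models like bge-en-icl are typically fine-tuned with an InfoNCE loss over in-batch negatives. Here is a minimal NumPy sketch of that loss; the function name `info_nce_loss` and the temperature value are illustrative assumptions, not the FlagEmbedding API.

```python
import numpy as np

def info_nce_loss(query_emb: np.ndarray, pos_emb: np.ndarray,
                  temperature: float = 0.05) -> float:
    """InfoNCE with in-batch negatives (illustrative sketch).

    query_emb, pos_emb: (batch, dim) L2-normalized embeddings. Row i of
    pos_emb is the positive passage for query i; every other row in the
    batch serves as a negative.
    """
    # Similarity matrix: entry (i, j) = sim(query_i, passage_j) / T
    logits = query_emb @ pos_emb.T / temperature
    # Cross-entropy where the diagonal (matched pair) is the correct class
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(query_emb))
    return float(-log_probs[idx, idx].mean())

# Toy check with random unit vectors
rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
q /= np.linalg.norm(q, axis=1, keepdims=True)
loss_matched = info_nce_loss(q, q)          # positives identical to queries
loss_shuffled = info_nce_loss(q, q[::-1])   # positives misaligned
```

With matched positives the diagonal dominates and the loss is near zero; shuffling the positives raises it, which is the signal the fine-tuning objective trains on.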
I want to continue pre-training the bge-en-icl model before fine-tuning it. Could you point me to an example of how to do that? I believe the examples are no longer in your repo.
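If the goal is MLM-style continued pre-training, the core data step is BERT-style dynamic masking: pick ~15% of tokens, replace 80% of those with a mask token, 10% with a random token, and keep 10% unchanged, then predict the originals. A minimal sketch is below; `MASK_ID`, `VOCAB_SIZE`, and `mask_tokens` are hypothetical placeholders, not FlagEmbedding code, and the ids should come from your actual tokenizer.

```python
import random

MASK_ID = 103       # assumed [MASK] token id (BERT-style); take from your tokenizer
VOCAB_SIZE = 30522  # assumed vocabulary size; take from your tokenizer

def mask_tokens(input_ids, mask_prob=0.15, seed=None):
    """BERT-style dynamic masking for MLM continued pre-training (sketch).

    Returns (masked_ids, labels). labels is -100 at unmasked positions so
    a standard cross-entropy loss ignores them.
    """
    rng = random.Random(seed)
    masked_ids = list(input_ids)
    labels = [-100] * len(input_ids)
    for i, tok in enumerate(input_ids):
        if rng.random() < mask_prob:
            labels[i] = tok                   # model must predict the original
            roll = rng.random()
            if roll < 0.8:
                masked_ids[i] = MASK_ID       # 80%: replace with [MASK]
            elif roll < 0.9:
                masked_ids[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
            # remaining 10%: keep the original token unchanged
    return masked_ids, labels

# Example: mask a toy sequence of 64 token ids
ids = list(range(1000, 1064))
masked, labels = mask_tokens(ids, mask_prob=0.3, seed=1)
```

The masked ids feed the encoder and the labels feed the MLM head's cross-entropy loss, which is the usual setup when continuing pre-training before contrastive fine-tuning.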