huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

how to fine tune custom dataset using coreference pretrained model #20161

Closed SavitaKumariPandit closed 1 year ago

SavitaKumariPandit commented 1 year ago

The pre-trained model "nreimers/mMiniLMv2-L12-H384-distilled-from-XLMR-Large" is available on the Hugging Face Hub. How do I fine-tune it with my own custom dataset?

NielsRogge commented 1 year ago

Hi,

For that I'll refer to the training guide of Sentence Transformers: https://www.sbert.net/docs/training/overview.html.

SavitaKumariPandit commented 1 year ago

> Hi,
>
> For that I'll refer to the training guide of Sentence Transformers: https://www.sbert.net/docs/training/overview.html.

Hi @NielsRogge, I need to use my own dataset for a coreference resolution task — will the suggestion above work with the pretrained model "nreimers/mMiniLMv2-L12-H384-distilled-from-XLMR-Large"? After fine-tuning I got an output folder containing: 1_Pooling, config.json, config_sentence_transformers.json, eval, modules.json, pytorch_model.bin, README.md, sentence_bert_config.json, sentencepiece.bpe.model, special_tokens_map.json, tokenizer.json, and tokenizer_config.json. I saved this folder as a zip file, and I load the path to the fine-tuned model for prediction, but prediction is not working.