huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Pretrain BART MNLI model on Financial Phrasebank #13142

Closed bartmnli closed 3 years ago

bartmnli commented 3 years ago

Hi, I am trying to train/fine-tune the BART-large model pretrained on MNLI on the Financial Phrasebank dataset, but I'm completely lost as I'm just a beginner.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained('facebook/bart-large-mnli')
tokenizer = AutoTokenizer.from_pretrained('facebook/bart-large-mnli')
```

I couldn't find any code examples for tokenizing the input text from Financial Phrasebank. Different tutorials show different ways and I'm completely confused.
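(Not from the original thread: a minimal sketch of what the tokenization step might look like. The sample sentences and label values below are illustrative placeholders, not drawn from the actual Financial Phrasebank data.)

```python
# Sketch: tokenizing Phrasebank-style sentences with the bart-large-mnli
# tokenizer, as a first step toward fine-tuning with the Trainer API.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-mnli")

# Illustrative sentences; the real dataset has a "sentence" text column
# and an integer sentiment label per example.
sentences = [
    "Operating profit rose compared with the same period a year earlier.",
    "The company reported a net loss for the quarter.",
]
labels = [2, 0]  # assumed convention: 0 = negative, 1 = neutral, 2 = positive

# Truncate/pad to a fixed length so the examples batch cleanly.
encodings = tokenizer(sentences, truncation=True, padding=True, max_length=128)
print(list(encodings.keys()))
```

From here the encodings and labels would typically be wrapped in a torch `Dataset` and handed to `Trainer`; if the corpus is loaded with the `datasets` library, the same tokenizer call can be applied in bulk via `dataset.map(..., batched=True)`. Note also that the MNLI head happens to have three output labels, which coincidentally matches Phrasebank's three sentiment classes, but the classes mean different things, so the classification head still needs fine-tuning (or reinitializing) on the new task.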

Can anyone please share links to examples similar to this?

I was also trying to find the fine-tuning code for the BART-large MNLI model fine-tuned on the Yahoo dataset by Joe Davison @joeddav (https://huggingface.co/joeddav/bart-large-mnli-yahoo-answers), but couldn't find it. Any suggestions or advice would be much appreciated.

Thanks in advance.

NielsRogge commented 3 years ago

Hi,

We like to keep GitHub issues for bugs/feature requests. For training-related questions, please use the forum. Many Hugging Face members are happy to help you there!

Thanks!

github-actions[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.