Closed: puraminy closed this issue 3 years ago.
Hi there,
The summarization and translation examples support fine-tuning T5 and mT5 (and other seq2seq models in the library). Please take a look at the readme and the script.
The scripts are easily modifiable to support training on any seq2seq task.
Also, there are multiple notebooks on T5 training in the community notebooks section. Hope that helps.
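For anyone landing here, below is a rough sketch of the kind of fine-tuning those scripts implement, using `Seq2SeqTrainer`. The checkpoint, the `billsum` dataset, and the hyperparameters are placeholders, not anything taken from the scripts themselves; the example scripts referenced above handle all of this (plus evaluation and generation) from the command line.

```python
# Rough sketch of T5/mT5 fine-tuning with Seq2SeqTrainer (placeholder dataset and settings).
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "t5-small"  # or "google/mt5-small" for mT5
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# billsum is just a placeholder dataset with "text"/"summary" columns.
raw = load_dataset("billsum", split="train[:1%]")

def preprocess(batch):
    # T5 expects a task prefix; mT5 does not need one.
    inputs = ["summarize: " + t for t in batch["text"]]
    model_inputs = tokenizer(inputs, max_length=512, truncation=True)
    # T5/mT5 use the same tokenizer for targets; the tokenized target ids become the labels.
    labels = tokenizer(batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="t5-finetuned",  # placeholder output directory
    per_device_train_batch_size=8,
    learning_rate=3e-4,
    num_train_epochs=1,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    # The collator pads inputs and replaces label padding with -100 so it is ignored by the loss.
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```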
Thank you very much! I hadn't found the examples you mentioned first, which are also up to date.
I found the community notebooks later, but some of them are outdated. Maybe adding a recent one to the main notebooks section would be a good idea.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Why is there no training example for T5 or mT5? Could you please give me a link to an example? I had a hard time writing code and ran into various errors. This is my code:
I don't know how to feed the labels to this model...
And this is the error:
@sgugger @patrickvonplaten @patil-suraj
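For reference, here is a minimal sketch of feeding labels to T5/mT5: passing `labels` to the forward call is enough, since the model builds the decoder inputs internally by shifting the labels and computes the loss itself. The checkpoint and example sentences are placeholders.

```python
# Minimal sketch: pass `labels` and the model returns the loss directly.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

inputs = tokenizer(
    "translate English to German: The house is wonderful.", return_tensors="pt"
)
targets = tokenizer("Das Haus ist wunderbar.", return_tensors="pt")

labels = targets.input_ids
# Padding tokens in the labels should be set to -100 so the loss ignores them.
labels[labels == tokenizer.pad_token_id] = -100

outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    labels=labels,
)
loss = outputs.loss  # cross-entropy over the target tokens
loss.backward()
print(loss.item())
```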