huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

How to change training epochs when using run_summarization.py #11641

Closed xuyeliu closed 3 years ago

xuyeliu commented 3 years ago

Information

I am using t5-large and t5-base to train a custom model on my own CSV dataset by running run_summarization.py, but I found that t5-small performs better than t5-base and t5-large. I think that is because run_summarization.py only trains for 3 epochs. Can you tell me how to change the number of training epochs?

I'm not sure whether my thinking is correct, so feel free to offer other suggestions. After all, it is strange that t5-small outperforms t5-large and t5-base.

Thank you very much!!!

patil-suraj commented 3 years ago

Hi there,

run_summarization.py uses the Trainer, so you can pass --num_train_epochs to control the number of epochs. Please find the docs here.
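For example, a training run with the epoch count raised to 10 might look like the sketch below. The file path, column names, and output directory are placeholders, not values from this issue:

```shell
# Minimal sketch of a run_summarization.py invocation with a custom
# epoch count. my_dataset.csv, the text/summary column names, and the
# output directory are hypothetical placeholders.
python run_summarization.py \
    --model_name_or_path t5-base \
    --do_train \
    --train_file my_dataset.csv \
    --text_column text \
    --summary_column summary \
    --source_prefix "summarize: " \
    --num_train_epochs 10 \
    --per_device_train_batch_size 4 \
    --output_dir ./t5-base-summarization \
    --overwrite_output_dir
```

Note that T5 models expect the `summarize: ` prefix on inputs, which is why `--source_prefix` is included here.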

Also please use the forum to ask such questions, issues are for bugs and feature requests. Thanks!

github-actions[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.