huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

I have used t5-base for abstractive summarization but it is not giving good results. Could you please give me a solution for this? #6408

Closed gopal354 closed 3 years ago

gopal354 commented 4 years ago

🖥 Benchmarking transformers

Benchmark

Which part of transformers did you benchmark?

Set-up

What did you run your benchmarks on? Please include details, such as: CPU, GPU? If using multiple GPUs, which parallelization did you use?

Results

Put your results here!

patil-suraj commented 4 years ago

Hi @gopal354, this depends on a lot of factors. What is the domain of your dataset? There are many other summarization models available on the model hub, trained on different datasets. You can try them as well. Or, if you have a dataset, you can further fine-tune these models on your domain.
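Poor T5 summaries are often a decoding problem rather than a model problem. A minimal sketch of the usual setup (assuming the `transformers` and `torch` packages are installed; the generation parameters below are common starting points, not values taken from this thread). Note that T5 is a multi-task model and only summarizes when the input carries the `summarize: ` task prefix:

```python
def build_t5_input(text: str) -> str:
    # T5 is multi-task: without the "summarize: " prefix it may not
    # treat the input as a summarization request at all.
    return "summarize: " + text

def summarize(text, model_name="t5-base", min_length=40, max_length=150):
    # Imported lazily so build_t5_input() stays usable on its own.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    inputs = tokenizer(build_t5_input(text), return_tensors="pt",
                       truncation=True, max_length=512)
    summary_ids = model.generate(
        inputs["input_ids"],
        num_beams=4,              # beam search usually beats greedy decoding
        length_penalty=2.0,       # > 1.0 favours longer summaries
        no_repeat_ngram_size=3,   # curbs repetition, a common T5 failure mode
        min_length=min_length,
        max_length=max_length,
        early_stopping=True,
    )
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)
```

To try an alternative hub checkpoint as suggested above, pass e.g. `model_name="facebook/bart-large-cnn"` (trained on news data) and drop the prefix logic for non-T5 models.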

patil-suraj commented 4 years ago

It would be nice if the detailed question were written in the description box rather than the title, and if the relevant issue template were used (this should be Questions & Help, not Benchmarking transformers). This will help the team and contributors act on the issue faster :)

stale[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.