Closed: TheTimKiely closed this issue 3 years ago.
Hello! The `bert_abs` example is not maintained anymore and should be moved to `examples/contrib/legacy`.
The recommended way of training sequence-to-sequence models is described in the `examples/seq2seq/README.md` file. What are you trying to do with `bertabs`, so that we may help you find what you need?
Hi! Thanks for your response. I'm just starting to experiment with abstractive text summarization. Is this something I should look for in the Hugging Face tools and samples? Thanks again, Tim
I believe abstractive text summarization is implemented in the `seq2seq` examples, as the XSUM models were trained to do abstractive text summarization.
Have you taken a look at the summarization examples in https://github.com/huggingface/transformers/tree/master/examples/seq2seq?
@patil-suraj may also be of help.
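Before diving into the fine-tuning scripts, a quick way to experiment with abstractive summarization is the `pipeline` API. This is a minimal sketch, not part of the linked examples; the checkpoint name below is illustrative, and the default summarization model will be downloaded on first use:

```python
from transformers import pipeline

# Loads the default summarization checkpoint (a BART model at the time of
# writing). An XSUM-trained checkpoint such as "facebook/bart-large-xsum"
# tends to produce more abstractive, single-sentence summaries.
summarizer = pipeline("summarization")

article = (
    "The Hugging Face transformers library provides thousands of pretrained "
    "models for tasks such as text classification, question answering, and "
    "summarization. The seq2seq examples show how to fine-tune "
    "encoder-decoder models like BART and T5 on custom datasets."
)

result = summarizer(article, max_length=40, min_length=5)
print(result[0]["summary_text"])
```

The generated text is abstractive: the model writes new sentences rather than extracting spans from the input.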
Thanks again!
I’ll take a look at the seq2seq examples.
-Tim
This issue has been automatically marked as stale and been closed because it has not had recent activity. Thank you for your contributions.
If you think this still needs to be addressed please comment on this thread.
Environment info
`transformers` version: 3.3.1

Who can help
@patil-suraj
Information
I am following the example in the seq2seq/bertabs README.md and getting this error:
In a debugger, I see that the `select_indices` parameter is a tensor of floats.
I don't understand the beam mechanism, so I don't know where to start troubleshooting this.
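For context, here is a minimal sketch of that kind of dtype mismatch, assuming `select_indices` is fed to `torch.index_select` during beam-search reordering (this is a guess at the failure mode, not the actual bertabs code path):

```python
import torch

# Beam search reorders hypotheses with torch.index_select, which requires
# integer (long) indices. A float index tensor raises a RuntimeError.
scores = torch.tensor([[0.1, 0.9, 0.3],
                       [0.4, 0.2, 0.8]])
select_indices = torch.tensor([1.0, 0.0])  # float, as seen in the debugger

try:
    torch.index_select(scores, 0, select_indices)
except RuntimeError as err:
    print("index_select rejects float indices:", err)

# Casting the indices to long is the usual workaround (a hypothetical fix,
# not necessarily the maintainers' patch):
reordered = torch.index_select(scores, 0, select_indices.long())
print(reordered)
```

So a first troubleshooting step would be finding where `select_indices` is created and why it ends up with a float dtype.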
Any help would be great!
-Tim