You can decode them back to a string using T5Tokenizer, like so:
tokenizer.decode(outputs.squeeze().tolist(), skip_special_tokens=True)
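For a fuller picture, here is a minimal end-to-end sketch; the t5-small checkpoint and the prompt are placeholders for illustration, not taken from the original report:

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Placeholder checkpoint and prompt, only for illustration
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

input_ids = tokenizer("translate English to German: The house is wonderful.",
                      return_tensors="pt").input_ids
outputs = model.generate(input_ids)  # tensor of generated token ids
text = tokenizer.decode(outputs.squeeze().tolist(), skip_special_tokens=True)
print(text)  # human-readable string
```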
Btw, for a really good guide on the different generation strategies of models like T5, see this blog post: https://huggingface.co/blog/how-to-generate
This post was really helpful, thanks!
Environment info
transformers version: 4.3.2
Who can help
Information
Model I am using (Bert, XLNet ...): T5
The problem arises when using:
To reproduce
Steps to reproduce the behavior:
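The exact steps were not included; a hypothetical minimal reproduction (the checkpoint and the input question are assumptions, not from the report) would look like this:

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Assumed checkpoint and input; the original report did not specify them
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

input_ids = tokenizer("question: What is the capital of France?",
                      return_tensors="pt").input_ids
outputs = model.generate(input_ids)
print(outputs)  # a torch.Tensor of token ids, not a string
```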
Expected behavior
To see the generated text. Instead, the model outputs a torch tensor like this:
tensor([[ 0, 363, 19, 8, 1784, 13, 1473, 58, 1]])
How do I get words out of it rather than a tensor?