Closed rsanjaykamath closed 3 years ago
Hey @rsanjaykamath,
I cannot reproduce the error on master. When running:
```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "allenai/unifiedqa-t5-small"  # you can specify the model size here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
```
I don't encounter any errors. Could you update transformers to the newest version and try again?
Hi @patrickvonplaten ,
That's strange. I just tried it on Colab with transformers version 4.2.2 and the same error occurs. Have you tried it on Colab or on a local machine?
I see, it's the classic sentencepiece error; I should have read your error message more closely ;-)
Here is a Colab showing that it works: https://colab.research.google.com/drive/1QybYdj-1bW0MHD0cutWBPWas5IFEhSjC?usp=sharing
OK, got it. Installing sentencepiece and restarting the kernel did the trick for me.
Thanks for your help :) Closing the issue.
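For anyone hitting the same thing: a small, hypothetical sketch (not from this thread; the helper name `check_sentencepiece` is my own) of how to check for the missing dependency up front instead of waiting for the tokenizer to fail:

```python
import importlib.util


def check_sentencepiece():
    """Return True if the sentencepiece package is importable."""
    return importlib.util.find_spec("sentencepiece") is not None


if not check_sentencepiece():
    # After installing, restart the kernel: modules already imported by
    # transformers have cached the missing-dependency state.
    print("sentencepiece is not installed; run `pip install sentencepiece` "
          "and restart the kernel before loading the T5 tokenizer.")
```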
I think the error message should be clearer.
In case it helps someone: I got this error because I had a corrupted or missing file in the Llama3 model. Downloading it again fixed it.
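On the corrupted-file point: one way to confirm a local weight file matches the hub copy is to hash it and compare against the checksum shown on the model page. A minimal sketch, using only the standard library (the helper name `sha256_of` is my own, not a transformers API):

```python
import hashlib


def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a local file, reading in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

If the digest differs from the one listed for that file on the Hugging Face model page, the download is corrupted and should be repeated.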
Environment info
transformers version: 4.2.2
Who can help
@mfuntowicz @patrickvonplaten
Information
Model I am using (Bert, XLNet ...): T5
The problem arises when using:
To reproduce
Steps to reproduce the behavior:
Expected behavior
The following code should load the model without errors.
Error
But the following error is obtained: