Closed — jorgeutd closed this issue 2 years ago
Hi,
LongT5 is only available in Transformers v4.20.
I guess it got released today. Thank you @NielsRogge you are the GOAT.
@NielsRogge do I still need to use the prefix:
if model_checkpoint in ["t5-small", "t5-base", "t5-large", "t5-3b", "t5-11b"]:
    prefix = "summarize: "
else:
    prefix = ""
for summarization with the LongT5 model or not?
Thanks,
Jorge
@NielsRogge Also, I think the model name in the LongT5 README (https://huggingface.co/google/long-t5-tglobal-large) is wrong. It says:
tokenizer = AutoTokenizer.from_pretrained("google/longt5-tglobal-large")
model = LongT5Model.from_pretrained("google/longt5-tglobal-large")
But I think it should be:
tokenizer = AutoTokenizer.from_pretrained("google/long-t5-tglobal-large")
model = LongT5Model.from_pretrained("google/long-t5-tglobal-large")
Feel free to open an issue/PR on the repo on the hub!
The code examples were fixed. Closing this issue!
@jorgeutd Hi! Have you figured out the answer to this question? The official documentation says that LongT5 does not use a prefix. How do we use it for different downstream tasks? Thanks.
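Based on the documentation cited in this thread (original T5 checkpoints were trained with task prefixes, LongT5 was not), the preprocessing step could be sketched like this; the checkpoint set and helper name are illustrative, not from any library:

```python
# T5 checkpoints that expect a task prefix (per the T5 paper/docs);
# LongT5 checkpoints such as "google/long-t5-tglobal-base" do not.
PREFIXED_T5_CHECKPOINTS = {"t5-small", "t5-base", "t5-large", "t5-3b", "t5-11b"}

def build_summarization_input(text: str, model_checkpoint: str) -> str:
    # Hypothetical helper: prepend "summarize: " only for classic T5 models.
    prefix = "summarize: " if model_checkpoint in PREFIXED_T5_CHECKPOINTS else ""
    return prefix + text

# build_summarization_input("Long article...", "t5-base")
#   -> "summarize: Long article..."
# build_summarization_input("Long article...", "google/long-t5-tglobal-base")
#   -> "Long article..."
```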
Hello Team,
I am getting the following error when I am trying to import the new LongT5Model into my notebook:
ImportError                               Traceback (most recent call last)
/tmp/ipykernel_55187/343105862.py in <cell line: 3>()
      1 # from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
      2
----> 3 from transformers import AutoTokenizer, LongT5Model
      4
      5 tokenizer = AutoTokenizer.from_pretrained("google/longt5-tglobal-base")

ImportError: cannot import name 'LongT5Model' from 'transformers' (/home/ec2-user/anaconda3/envs/pytorch_p38/lib/python3.8/site-packages/transformers/__init__.py)
Transformers version: 4.19.4
Python version: 3.8
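As noted earlier in the thread, LongT5 was only added in Transformers v4.20, so the 4.19.4 environment above cannot import `LongT5Model`; upgrading (e.g. `pip install -U transformers`) should fix it. A small sketch of guarding the import on the installed version (the parsing helper is hypothetical and only handles plain numeric versions):

```python
# LongT5 classes landed in Transformers v4.20.0 (per this thread).
MIN_LONGT5_VERSION = (4, 20, 0)

def parse_version(v: str) -> tuple:
    # Keep only leading numeric components, e.g. "4.19.4" -> (4, 19, 4).
    parts = []
    for piece in v.split("."):
        if piece.isdigit():
            parts.append(int(piece))
        else:
            break
    return tuple(parts)

def supports_longt5(installed_version: str) -> bool:
    # True if the installed Transformers version is new enough for LongT5.
    return parse_version(installed_version) >= MIN_LONGT5_VERSION

# supports_longt5("4.19.4") -> False  (the environment in the traceback)
# supports_longt5("4.20.0") -> True
```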