LalitaDeelert / lalita-mt-zhth

Apache License 2.0

Empty output from model.generate. #16

Closed peerachetporkaew closed 3 years ago

peerachetporkaew commented 3 years ago

Hi,

I tested the model with the code below and the decoded output is an empty string (""). Do you have any idea why?

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Chinese -> Thai direction
tokenizer_zhth = AutoTokenizer.from_pretrained("Lalita/marianmt-zh_cn-th")
model_zhth = AutoModelForSeq2SeqLM.from_pretrained("Lalita/marianmt-zh_cn-th")
# Thai -> Chinese direction
tokenizer_thzh = AutoTokenizer.from_pretrained("Lalita/marianmt-th-zh_cn")
model_thzh = AutoModelForSeq2SeqLM.from_pretrained("Lalita/marianmt-th-zh_cn")

# Tokenize the Chinese source sentence
src_text = ["这个问题还未完全清晰, 因为这还是新的研究领域"]
a = tokenizer_zhth(src_text, return_tensors="pt", padding=True)
print(a)

# Generate the Thai translation
translated = model_zhth.generate(**a)

print(translated)
# Decode, dropping special tokens (pad/eos)
print([tokenizer_zhth.decode(t, skip_special_tokens=True) for t in translated])

I use PyTorch 1.8 and transformers 4.6.0, as suggested on the Hugging Face website. Thanks
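For completeness, the Thai -> Chinese tokenizer and model loaded in the snippet above are never called; a minimal sketch of exercising that direction the same way (the Thai input sentence here is only an illustrative placeholder, not from the original report):

# Sketch of the reverse (Thai -> Chinese) direction, reusing the
# tokenizer_thzh / model_thzh objects loaded above.
th_text = ["ประเด็นนี้ยังไม่ชัดเจนนัก"]  # placeholder Thai sentence
b = tokenizer_thzh(th_text, return_tensors="pt", padding=True)
translated_th = model_thzh.generate(**b)
print([tokenizer_thzh.decode(t, skip_special_tokens=True) for t in translated_th])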

peerachetporkaew commented 3 years ago

The problem was solved after moving my environment to Ubuntu! However, I still cannot run the pretrained model on Windows 10. Thanks for your contribution.
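One way to narrow down the Windows 10 failure is to check whether generate() is returning only special tokens (pad/eos), which would decode to an empty string once skip_special_tokens=True is applied. A minimal diagnostic sketch, assuming the same variables as the snippet above:

# Diagnostic sketch: if every generated id is a special token,
# the decoded string will be empty, matching the reported symptom.
translated = model_zhth.generate(**a)
special_ids = set(tokenizer_zhth.all_special_ids)
for seq in translated:
    ids = seq.tolist()
    print("generated ids:", ids)
    print("only special tokens:", all(i in special_ids for i in ids))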