facebookresearch / fairseq

Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
MIT License

Adding source and target language parameters into TransformerModel.from_pretrained() function during translation? #3304

Open RavneetDTU opened 3 years ago

RavneetDTU commented 3 years ago

I am trying to translate from English to Tamil with a model that was trained on a custom dataset using fairseq.

Code

model = TransformerModel.from_pretrained(
    'en_ta_model',
    checkpoint_file='en_ta.pt',
    bpe='subword_nmt',
    bpe_codes='en_ta_model/codes'
)

Error:

Could not infer language pair, please provide it explicitly

I have used this same script for French, German, Russian, etc. in the past, and it never asked for the languages explicitly. How do I add source and target language parameters to the TransformerModel.from_pretrained() function?

RavneetDTU commented 3 years ago

@alexeib can you please suggest something?

alexeib commented 3 years ago

try adding source_lang and target_lang params to from_pretrained (they correspond to the fields of the same name in the translation task config)?
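A minimal sketch of this suggestion, using the asker's paths (`en_ta_model`, `en_ta.pt`, `en_ta_model/codes`); the `"en"`/`"ta"` language codes are assumptions based on the English-to-Tamil setup, and the actual codes must match the suffixes of the dictionary files produced by `fairseq-preprocess` (e.g. `dict.en.txt`, `dict.ta.txt`):

```python
# Sketch: collect the keyword arguments that would be forwarded to
# TransformerModel.from_pretrained; source_lang/target_lang map to the
# translation task config, which is what "Could not infer language pair"
# is asking for.
def build_hub_kwargs(source_lang, target_lang):
    """Build the kwargs for from_pretrained (paths are the asker's)."""
    return {
        "checkpoint_file": "en_ta.pt",          # trained checkpoint
        "bpe": "subword_nmt",                   # BPE scheme used in training
        "bpe_codes": "en_ta_model/codes",       # BPE merge codes
        "source_lang": source_lang,             # assumed: "en"
        "target_lang": target_lang,             # assumed: "ta"
    }

kwargs = build_hub_kwargs("en", "ta")

# With fairseq installed, the model would then be loaded as:
#   from fairseq.models.transformer import TransformerModel
#   model = TransformerModel.from_pretrained('en_ta_model', **kwargs)
```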

stale[bot] commented 3 years ago

This issue has been automatically marked as stale. If this issue is still affecting you, please leave any comment (for example, "bump"), and we'll keep it open. We are sorry that we haven't been able to prioritize it yet. If you have any new additional information, please include it with your comment!

kyoto7250 commented 1 year ago

This is an old issue, but I ran into the same problem, so I am posting the code that worked for me.

model = TransformerModel.from_pretrained(
    'checkpoints',           # directory containing the checkpoint
    'checkpoint_best.pt',    # checkpoint file
    'tokenized',             # data directory with the fairseq dictionaries
    bpe='subword_nmt',
    bpe_codes='code',
    source_lang="en",
    target_lang="de"
)

text = 'Hello'
model.translate(text, beam=5)