AI4Bharat / indicTrans

indicTranslate v1 - Machine Translation for 11 Indic languages. For the latest v2, see: https://github.com/AI4Bharat/IndicTrans2
https://ai4bharat.iitm.ac.in/indic-trans
MIT License

Running into errors using Colab for inference #7

Closed: drishanarora closed this issue 3 years ago

drishanarora commented 3 years ago

Hi

Apologies for creating an issue out of the blue. I came across this codebase via https://indicnlp.ai4bharat.org/indic-trans/ and I am a big fan of this project.

I am running into some issues while running the code. I followed the Colab notebook linked in the README (along with the PR that adds a few fixes: https://github.com/AI4Bharat/indicTrans/pull/6).

In particular:

  1. The script joint_translate.sh expects a file ../en-indic/vocab/bpe_codes.32k.SRC_TGT, which does not exist. I assume the required file is ../en-indic/vocab/bpe_codes.32k.SRC instead (see the first sketch after this list).

  2. After changing the path as above, I still get this error when running the model:

AssertionError: Could not infer model type from Namespace(_name='transformer_4x', ...

I assume this is because transformer_4x is a custom architecture and is not among the models registered in fairseq/fairseq/models/__init__.py (see the second sketch after this list).
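For reference, here is what I changed for item 1: a minimal sketch, assuming joint_translate.sh reads the BPE codes path into a shell variable (the variable name is my guess, not necessarily the script's actual one):

```bash
# Point the script at the BPE codes file that actually ships with the
# downloaded en-indic bundle; the SRC_TGT suffix in the original script
# does not match any file on disk.
BPE_CODES=../en-indic/vocab/bpe_codes.32k.SRC   # was: bpe_codes.32k.SRC_TGT
```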
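For item 2, my understanding is that fairseq only resolves architectures that were registered at import time, so a custom one like transformer_4x has to be imported explicitly, e.g. via fairseq's --user-dir flag. A hedged sketch of what the generation step might look like; the model_configs path is my assumption about where this repo registers transformer_4x, and the data paths and decoding flags are placeholders:

```bash
# --user-dir makes fairseq import the given directory as a Python package
# before loading the checkpoint, so the register_model_architecture call
# for transformer_4x runs and Namespace(_name='transformer_4x', ...) can
# be resolved instead of raising the AssertionError above.
fairseq-interactive ../en-indic/final_bin \
  --user-dir model_configs \
  --path ../en-indic/model/checkpoint_best.pt \
  --task translation \
  --source-lang SRC --target-lang TGT \
  --batch-size 64 --beam 5 < test.bpe > test.out
```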

Would it be possible to make sure the code runs in the Colab? Thank you so much!

drishanarora commented 3 years ago

Oh, my bad. I see that the PR also changes joint_translate.sh to register the custom model. Let me try again.

anoopkunchukuttan commented 3 years ago

The pull request is now merged.