I am running into some issues while running the code. I followed the Colab notebook linked in the README (together with the fixes from https://github.com/AI4Bharat/indicTrans/pull/6).
In particular:
The script `joint_translate.sh` expects a file `../en-indic/vocab/bpe_codes.32k.SRC_TGT`, which does not exist. I assume the file it actually needs is `../en-indic/vocab/bpe_codes.32k.SRC` instead.
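For reference, the workaround I applied is roughly the following (the directory layout here is an assumption based on the error message; I use a stand-in `demo_en-indic/vocab` directory rather than the real model folder):

```shell
#!/bin/sh
# Sketch of the workaround: point the filename joint_translate.sh expects
# at the BPE codes file that actually ships with the model.
VOCAB_DIR=demo_en-indic/vocab          # stand-in for ../en-indic/vocab
mkdir -p "$VOCAB_DIR"
: > "$VOCAB_DIR/bpe_codes.32k.SRC"     # stand-in for the real BPE codes file
# Symlink the expected name to the existing file:
ln -sf bpe_codes.32k.SRC "$VOCAB_DIR/bpe_codes.32k.SRC_TGT"
```

Alternatively, editing the path inside `joint_translate.sh` itself would work just as well.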
After changing the filename as above, running the model still fails with this error:
`AssertionError: Could not infer model type from Namespace(_name='transformer_4x', ...`
I assume this is because `transformer_4x` is a custom model architecture that is not among the models registered in `fairseq/fairseq/models/__init__.py`.
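To illustrate my guess: a toy sketch (not fairseq's actual code) of the architecture-registry pattern, showing why checkpoint loading would fail when a custom architecture name such as `transformer_4x` has not been registered before the lookup happens:

```python
# Minimal registry sketch. In fairseq, loading a checkpoint asserts that the
# saved _name can be resolved against a registry of known architectures; a
# custom arch must be registered (its defining module imported) beforehand.
MODEL_REGISTRY = {}

def register_model_architecture(name):
    """Decorator that records an architecture function under `name`."""
    def wrapper(fn):
        MODEL_REGISTRY[name] = fn
        return fn
    return wrapper

def infer_model_type(name):
    """Mimics the failing assertion from the traceback."""
    assert name in MODEL_REGISTRY, (
        f"Could not infer model type from Namespace(_name={name!r}, ..."
    )
    return MODEL_REGISTRY[name]

# Before registration, the lookup fails just like in the issue:
try:
    infer_model_type("transformer_4x")
except AssertionError as e:
    print(e)

# Registering the custom architecture (what importing indicTrans's model
# definitions should accomplish) makes the lookup succeed:
@register_model_architecture("transformer_4x")
def transformer_4x(args):
    pass

infer_model_type("transformer_4x")  # now resolves
```

If that is the right diagnosis, the fix on the Colab side would presumably be to ensure the module defining `transformer_4x` is imported (or passed via fairseq's user-dir mechanism) before generation is run.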
Would it be possible to make sure the code runs in the Colab notebook? Thank you so much!
Hi
Apologies for creating an issue out of the blue. I came across this codebase via https://indicnlp.ai4bharat.org/indic-trans/ and I am a big fan of this project.