facebookresearch / ELI5

Scripts and links to recreate the ELI5 dataset.

Using the pre-trained model #15

Closed sravaniprakash9 closed 5 years ago

sravaniprakash9 commented 5 years ago

When I ran this command:

cat testing_files/output_for_multitask_bpe.txt | python ~/fairseq/interactive.py ~/fairseq/data-bin/eli5_data --path multitask_checkpoint.pt --task translation --batch-size 16 --nbest 1 --beam 5 --source-lang multitask_source_bpe --target-lang multitask_target_bpe --nbest 1 --prefix-size 0 --remove-bpe --max-len-b 500 --max-len-a 0 --min-len 250 --buffer-size 1 --batch-size 1 --no-repeat-ngram-size 3

I am getting the following error: ModuleNotFoundError: No module named 'fairseq.models.levenshtein_transformer'

Is this a fairseq installation error, or am I missing something? Below are the logs:

Traceback (most recent call last):
  File "generate.py", line 12, in <module>
    from fairseq import bleu, checkpoint_utils, options, progress_bar, tasks, utils
  File "/root/sravani/ELIF_new/fairseq/fairseq/__init__.py", line 10, in <module>
    import fairseq.models  # noqa
  File "/root/sravani/ELIF_new/fairseq/fairseq/models/__init__.py", line 128, in <module>
    module = importlib.import_module('fairseq.models.' + model_name)
  File "/root/anaconda3/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/root/sravani/ELIF_new/fairseq/fairseq/models/insertion_transformer.py", line 11, in <module>
    from fairseq.models.levenshtein_transformer import (
ModuleNotFoundError: No module named 'fairseq.models.levenshtein_transformer'


huihuifan commented 5 years ago

Hm, are you on the latest version of fairseq? That looks like a problem with the fairseq library. The Levenshtein transformer for non-autoregressive translation was added recently. Could you try pulling from fairseq again to update your version?
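For reference, updating an editable install usually just means pulling the latest commits into the existing checkout and reinstalling it; the directory name below is an assumption based on the install steps quoted later in this thread:

cd fairseq
git pull
pip install --editable .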

sravaniprakash9 commented 5 years ago

Hi,

Thank you for the reply. I have fairseq 0.8.0 running, which I believe is the latest version. I am having the same problem even after reinstalling fairseq.

I installed it as shown below:

Installing from source

To install fairseq from source and develop locally:

git clone https://github.com/pytorch/fairseq
cd fairseq
pip install --editable .
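As a quick sanity check (a minimal sketch; fairseq.__version__ and fairseq.__file__ are standard attributes, but the exact output depends on your setup), you can confirm which fairseq installation the interpreter actually picks up, since a previously pip-installed 0.8.0 release can shadow the editable git checkout:

python -c "import fairseq; print(fairseq.__version__, fairseq.__file__)"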

huihuifan commented 5 years ago

Hm, strange. The error here:

File "/root/sravani/ELIF_new/fairseq/fairseq/models/insertion_transformer.py", line 11, in from fairseq.models.levenshtein_transformer import ( ModuleNotFoundError: No module named 'fairseq.models.levenshtein_transformer'

is unrelated to ELI5 and looks like a problem with insertion_transformer.py in fairseq; we do not use that model at all. Looking at the fairseq repo, there do seem to be reports that the latest commits are broken and that a fix is in progress. Could you try commenting out this line for now? A rough sketch of the workaround is below.
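For concreteness (the exact line number and the names inside the parentheses vary by commit and are elided here, matching the truncated traceback), the suggestion amounts to editing fairseq/models/insertion_transformer.py and commenting out the offending import:

# from fairseq.models.levenshtein_transformer import (
#     ...
# )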

sravaniprakash9 commented 5 years ago

Hi,

Thanks a lot! I am able to use the pre-trained model now.

huihuifan commented 5 years ago

Great! I will close this issue.