facebookresearch / fairseq

Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
MIT License

cannot import name 'libbleu', encountered when running torch hub translation example #2427

Closed blmoistawinde closed 4 years ago

blmoistawinde commented 4 years ago

🐛 Bug

I am seeing ImportError: cannot import name 'libbleu' when running the torch hub translation example on Google Colab.

To Reproduce

Run the second code snippet of the notebook; the error occurs at the torch.hub.load line.

(screenshot of the error message)

Environment

alexeib commented 4 years ago

have you tried running "pip install --editable ." like the error message says ?

HuipengXu commented 4 years ago

have you tried running "pip install --editable ." like the error message says ?

I have run "pip install --editable ." successfully, but still get the same error ):

XuhXie commented 4 years ago

I meet the same problem ):

blmoistawinde commented 4 years ago

A temporary solution found

  • on Colab:

    • clone this repo and run pip install --editable .
    • restart the kernel
  • on the Linux server:

    • copy the files from the git-cloned fairseq over the files in the torch.hub cache, whose directory looks like /home/{username}/.cache/torch/hub/pytorch_fairseq_master

Though the workaround works, the underlying problem clearly remains. I think it comes from the fairseq version that torch.hub downloads (https://github.com/pytorch/fairseq/archive/master.zip). I hope that can be fixed.
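The server-side workaround above can be sketched in Python. The paths here are throwaway stand-ins for the real git clone and the torch.hub cache, so the sketch runs anywhere; note that shutil.copytree(..., dirs_exist_ok=True) needs Python 3.8+.

```python
# Sketch of the workaround: overwrite the torch.hub cache copy of fairseq
# with the locally built clone, so the compiled libbleu extension is present.
import shutil
import tempfile
from pathlib import Path

def overwrite_hub_cache(clone_dir, cache_dir):
    """Copy the git clone over the torch.hub cache, replacing existing files."""
    shutil.copytree(clone_dir, cache_dir, dirs_exist_ok=True)

# Throwaway directories stand in for ~/fairseq and
# ~/.cache/torch/hub/pytorch_fairseq_master.
clone = Path(tempfile.mkdtemp())
cache = Path(tempfile.mkdtemp())
(clone / "libbleu.so").touch()  # the extension built by `pip install --editable .`
overwrite_hub_cache(clone, cache)
print((cache / "libbleu.so").exists())  # True: the cached tree now has it
```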

Below is how I found the problem.

Colab test

I tried pip install --editable .; the error was not resolved, and another error, ModuleNotFoundError: No module named 'fairseq.hub_utils', occurred in the earlier code.


Surprisingly, when I restarted the kernel, the errors were gone.


Another test

However, the Colab "solution" still doesn't fix the similar problem on my Linux server.

The situation on my server is: I can use torch.hub.load normally in the interactive Python command line, but running python xx.py in bash, where the script contains the same torch.hub.load line, fails.
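One way to diagnose which copy of fairseq an invocation actually picks up is to print the module's origin. A small sketch; the helper name is mine, and any importable module works as the argument:

```python
# Print where Python would load a module from, to tell the editable install
# apart from the torch.hub cache copy.
import importlib.util

def module_origin(name):
    """Return the filesystem path a module would be loaded from, or None."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

print(module_origin("json"))  # a stdlib example; on the server, try "fairseq"
```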

Steps:

  • The test environment is IPython.
  • Run python backtranslation/back_translation_server.py, which contains torch.hub.load('pytorch/fairseq', 'transformer.wmt16.en-de', tokenizer='moses', bpe='subword_nmt'):

IPython 7.16.1 -- An enhanced Interactive Python. Type '?' for help.

In [1]: !python backtranslation/back_translation_server.py                                                                    
Torch version: 1.5.0
Starting to load English to German Translation Model:
Using cache found in /home/XXX/.cache/torch/hub/pytorch_fairseq_master
ERROR: missing libbleu.so. run `pip install --editable .`
Traceback (most recent call last):
  File "backtranslation/back_translation_server.py", line 13, in <module>
    en2de = torch.hub.load('pytorch/fairseq','transformer.wmt16.en-de', tokenizer='moses', bpe='subword_nmt')
  File "/home/XXX/.conda/envs/synqg/lib/python3.6/site-packages/torch/hub.py", line 365, in load
    hub_module = import_module(MODULE_HUBCONF, repo_dir + '/' + MODULE_HUBCONF)
  File "/home/XXX/.conda/envs/synqg/lib/python3.6/site-packages/torch/hub.py", line 75, in import_module
    spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/hubconf.py", line 8, in <module>
    from fairseq.hub_utils import BPEHubInterface as bpe  # noqa
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/fairseq/__init__.py", line 18, in <module>
    import fairseq.models  # noqa
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/fairseq/models/__init__.py", line 132, in <module>
    module = importlib.import_module('fairseq.models.' + model_name)
  File "/home/XXX/.conda/envs/synqg/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/fairseq/models/transformer_from_pretrained_xlm.py", line 12, in <module>
    from fairseq.models.transformer import (
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/fairseq/models/transformer.py", line 11, in <module>
    from fairseq import options, utils
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/fairseq/options.py", line 12, in <module>
    from fairseq import scoring, utils
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/fairseq/scoring/__init__.py", line 22, in <module>
    importlib.import_module("fairseq.scoring." + module)
  File "/home/XXX/.conda/envs/synqg/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/fairseq/scoring/bleu.py", line 18, in <module>
    raise e
  File "/home/XXX/.cache/torch/hub/pytorch_fairseq_master/fairseq/scoring/bleu.py", line 13, in <module>
    from fairseq import libbleu
ImportError: cannot import name 'libbleu'
  • However, I can run this line without any problem directly in IPython, in the same environment:
In [2]: from fairseq import libbleu                                                                                           

In [3]: import torch                                                                                                          

In [4]: torch.hub.list('pytorch/fairseq')                                                                                     
Using cache found in /home/XXX/.cache/torch/hub/pytorch_fairseq_master
Out[4]: 
['bart.base',
 'bart.large',
 'bart.large.cnn',
 'bart.large.mnli',
 'bart.large.xsum',
 'bpe',
 'camembert',
 'camembert-base',
 'camembert-base-ccnet',
 'camembert-base-ccnet-4gb',
 'camembert-base-oscar-4gb',
 'camembert-base-wikipedia-4gb',
 'camembert-large',
 'camembert.v0',
 'conv.stories',
 'conv.stories.pretrained',
 'conv.wmt14.en-de',
 'conv.wmt14.en-fr',
 'conv.wmt17.en-de',
 'data.stories',
 'dynamicconv.glu.wmt14.en-fr',
 'dynamicconv.glu.wmt16.en-de',
 'dynamicconv.glu.wmt17.en-de',
 'dynamicconv.glu.wmt17.zh-en',
 'dynamicconv.no_glu.iwslt14.de-en',
 'dynamicconv.no_glu.wmt16.en-de',
 'lightconv.glu.wmt14.en-fr',
 'lightconv.glu.wmt16.en-de',
 'lightconv.glu.wmt17.en-de',
 'lightconv.glu.wmt17.zh-en',
 'lightconv.no_glu.iwslt14.de-en',
 'lightconv.no_glu.wmt16.en-de',
 'roberta.base',
 'roberta.large',
 'roberta.large.mnli',
 'roberta.large.wsc',
 'tokenizer',
 'transformer.wmt14.en-fr',
 'transformer.wmt16.en-de',
 'transformer.wmt18.en-de',
 'transformer.wmt19.de-en',
 'transformer.wmt19.de-en.single_model',
 'transformer.wmt19.en-de',
 'transformer.wmt19.en-de.single_model',
 'transformer.wmt19.en-ru',
 'transformer.wmt19.en-ru.single_model',
 'transformer.wmt19.ru-en',
 'transformer.wmt19.ru-en.single_model',
 'transformer_lm.gbw.adaptive_huge',
 'transformer_lm.wiki103.adaptive',
 'transformer_lm.wmt19.de',
 'transformer_lm.wmt19.en',
 'transformer_lm.wmt19.ru',
 'xlmr.base',
 'xlmr.large']

In [8]: en2de = torch.hub.load('pytorch/fairseq','transformer.wmt16.en-de', tokenizer='moses', bpe='subword_nmt')             
Using cache found in /home/XXX/.cache/torch/hub/pytorch_fairseq_master
  0%|                                                                       | 1029120/2193287384 [00:05<3:21:40, 181167.62B/s

On my server, fairseq was installed with pip install --editable . from a clone of the master branch.

I noticed that torch.hub uses a separately downloaded copy of fairseq:

Downloading: "https://github.com/pytorch/fairseq/archive/master.zip" to /home/XXX/.cache/torch/hub/pytorch_fairseq_master

I then copied the files from the git-cloned fairseq over the files in the torch.hub cache (/home/XXX/.cache/torch/hub/pytorch_fairseq_master), and the problem was gone!
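An alternative to copying files by hand would be to delete the stale cached copy so that the next torch.hub.load re-downloads the repo (torch.hub.load also accepts force_reload=True for the same effect). A runnable sketch, with a throwaway directory standing in for the real cache path:

```python
# Remove a torch.hub cache directory so the next torch.hub.load re-downloads it.
import shutil
import tempfile
from pathlib import Path

def clear_hub_cache(cache_dir):
    """Delete the cache directory if it exists; return True if removed."""
    p = Path(cache_dir)
    if p.is_dir():
        shutil.rmtree(p)
        return True
    return False

# Stand-in for /home/{username}/.cache/torch/hub/pytorch_fairseq_master:
demo = Path(tempfile.mkdtemp())
print(clear_hub_cache(demo))  # True: the directory was removed
```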

HuipengXu commented 4 years ago

> A temporary solution found
>
> • on Colab: clone this repo, run pip install --editable ., and restart the kernel
> • on the Linux server: copy the files from the git-cloned fairseq over the files in the torch.hub cache, with a directory like /home/{username}/.cache/torch/hub/pytorch_fairseq_master

thanks, it solved my problem

HuipengXu commented 4 years ago


another problem :joy:

myleott commented 4 years ago

Fixed by 6f9ed78059f67ce0bbc107f513064a931f551392