coli-saar / am-parser

Modular implementation of an AM dependency parser in AllenNLP.
Apache License 2.0

openssl error on train #99

Closed megodoonch closed 2 years ago

megodoonch commented 2 years ago

Hi all, I'm trying to get the parser working on my laptop, and I get an openssl error:

ImportError: libssl.so.3: cannot open shared object file: No such file or directory

I suspect I just have some kind of version incompatibility. Maybe I just need a working pip freeze from someone who's got it running.

Here's the traceback:

Traceback (most recent call last):
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/transformers/utils/import_utils.py", line 857, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/transformers/tokenization_utils.py", line 26, in <module>
    from .tokenization_utils_base import (
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/transformers/tokenization_utils_base.py", line 72, in <module>
    from tokenizers import AddedToken
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/tokenizers/__init__.py", line 79, in <module>
    from .tokenizers import (
ImportError: libssl.so.3: cannot open shared object file: No such file or directory

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "train.py", line 66, in <module>
    from allennlp.commands.subcommand import Subcommand
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/commands/__init__.py", line 8, in <module>
    from allennlp.commands.build_vocab import BuildVocab
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/commands/build_vocab.py", line 16, in <module>
    from allennlp.training.util import make_vocab_from_params
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/training/__init__.py", line 2, in <module>
    from allennlp.training.no_op_trainer import NoOpTrainer
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/training/no_op_trainer.py", line 6, in <module>
    from allennlp.models import Model
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/models/__init__.py", line 6, in <module>
    from allennlp.models.model import Model
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/models/model.py", line 18, in <module>
    from allennlp.data import Instance, Vocabulary
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/data/__init__.py", line 1, in <module>
    from allennlp.data.data_loaders import (
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/data/data_loaders/__init__.py", line 1, in <module>
    from allennlp.data.data_loaders.data_loader import DataLoader, TensorDict
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/data/data_loaders/data_loader.py", line 6, in <module>
    from allennlp.data.instance import Instance
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/data/instance.py", line 3, in <module>
    from allennlp.data.fields.field import DataArray, Field
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/data/fields/__init__.py", line 6, in <module>
    from allennlp.data.fields.field import Field
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/data/fields/field.py", line 6, in <module>
    from allennlp.data.vocabulary import Vocabulary
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/allennlp/data/vocabulary.py", line 14, in <module>
    from transformers import PreTrainedTokenizer
  File "<frozen importlib._bootstrap>", line 1032, in _handle_fromlist
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/transformers/utils/import_utils.py", line 847, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/mego/anaconda3/envs/am-parser/lib/python3.7/site-packages/transformers/utils/import_utils.py", line 861, in _get_module
    ) from e
RuntimeError: Failed to import transformers.tokenization_utils because of the following error (look up to see its traceback):
libssl.so.3: cannot open shared object file: No such file or directory
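A quick way to check whether the failing library is visible to the dynamic linker at all is a small ctypes probe (a diagnostic sketch; the library name libssl.so.3 is taken from the error above, and the outcome will differ per system):

```python
import ctypes
import ctypes.util

# find_library searches the standard linker paths for an OpenSSL shared library;
# it returns a soname string if one is found, or None otherwise.
print("linker sees libssl as:", ctypes.util.find_library("ssl"))

# Try to load the exact file the traceback complains about. On systems that
# only ship OpenSSL 1.x this raises OSError, matching the ImportError above.
try:
    ctypes.CDLL("libssl.so.3")
    print("libssl.so.3 loaded successfully")
except OSError as exc:
    print("libssl.so.3 not loadable:", exc)
```

If the probe fails, the installed tokenizers wheel was presumably built against OpenSSL 3 while the system provides an older version, which would explain why running inside a container that ships OpenSSL 3 resolves it.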
megodoonch commented 2 years ago

Never mind, it seems to have been a version conflict. Using the Dockerfile from @tsimafeip's pull request fixed the issue.