huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

libssl.so.10: cannot open shared object file: No such file or directory #21805

Closed falconair closed 1 year ago

falconair commented 1 year ago

System Info

I am setting up a brand new machine with Ubuntu 22.04, pytorch 1.13.1/pytorch-cuda 11.7 and transformers 4.24.0

Who can help?

No response

Information

Tasks

Reproduction

I installed transformers using the following command, as suggested by huggingface docs:

conda install -c huggingface transformers --y

I'm then running the following Python import: from transformers import pipeline

I'm getting the following exception:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
~/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
   1075         try:
-> 1076             return importlib.import_module("." + module_name, self.__name__)
   1077         except Exception as e:

~/anaconda3/lib/python3.9/importlib/__init__.py in import_module(name, package)
    126             level += 1
--> 127     return _bootstrap._gcd_import(name[level:], package, level)
    128 

~/anaconda3/lib/python3.9/importlib/_bootstrap.py in _gcd_import(name, package, level)

~/anaconda3/lib/python3.9/importlib/_bootstrap.py in _find_and_load(name, import_)

~/anaconda3/lib/python3.9/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)

~/anaconda3/lib/python3.9/importlib/_bootstrap.py in _load_unlocked(spec)

~/anaconda3/lib/python3.9/importlib/_bootstrap_external.py in exec_module(self, module)

~/anaconda3/lib/python3.9/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

~/anaconda3/lib/python3.9/site-packages/transformers/pipelines/__init__.py in <module>
     32 from ..feature_extraction_utils import PreTrainedFeatureExtractor
---> 33 from ..models.auto.configuration_auto import AutoConfig
     34 from ..models.auto.feature_extraction_auto import FEATURE_EXTRACTOR_MAPPING, AutoFeatureExtractor

~/anaconda3/lib/python3.9/site-packages/transformers/models/__init__.py in <module>
     18 
---> 19 from . import (
     20     albert,

~/anaconda3/lib/python3.9/site-packages/transformers/models/mt5/__init__.py in <module>
     39 if is_tokenizers_available():
---> 40     from ..t5.tokenization_t5_fast import T5TokenizerFast
     41 else:

~/anaconda3/lib/python3.9/site-packages/transformers/models/t5/tokenization_t5_fast.py in <module>
     22 
---> 23 from ...tokenization_utils_fast import PreTrainedTokenizerFast
     24 from ...utils import is_sentencepiece_available, logging

~/anaconda3/lib/python3.9/site-packages/transformers/tokenization_utils_fast.py in <module>
     24 
---> 25 import tokenizers.pre_tokenizers as pre_tokenizers_fast
     26 from tokenizers import Encoding as EncodingFast

~/anaconda3/lib/python3.9/site-packages/tokenizers/__init__.py in <module>
     78 
---> 79 from .tokenizers import (
     80     Tokenizer,

ImportError: libssl.so.10: cannot open shared object file: No such file or directory

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
/tmp/ipykernel_121111/4287807559.py in <module>
----> 1 from transformers import pipeline

~/anaconda3/lib/python3.9/importlib/_bootstrap.py in _handle_fromlist(module, fromlist, import_, recursive)

~/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py in __getattr__(self, name)
   1064             value = self._get_module(name)
   1065         elif name in self._class_to_module.keys():
-> 1066             module = self._get_module(self._class_to_module[name])
   1067             value = getattr(module, name)
   1068         else:

~/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py in _get_module(self, module_name)
   1076             return importlib.import_module("." + module_name, self.__name__)
   1077         except Exception as e:
-> 1078             raise RuntimeError(
   1079                 f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
   1080                 f" traceback):\n{e}"

RuntimeError: Failed to import transformers.pipelines because of the following error (look up to see its traceback):
libssl.so.10: cannot open shared object file: No such file or directory

Expected behavior

Please note that I'm following the official install instructions on a brand new machine!

There are two other tickets with the same issue: https://github.com/huggingface/transformers/issues/18549 https://github.com/huggingface/transformers/issues/19844

Both were closed because the users simply switched to pip, but the problem remains for conda installs.

The error also goes away for me if I run pip install transformers --force-reinstall.
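Concretely, that workaround is a single command run inside the same conda environment (presumably it helps because the wheels pulled from PyPI do not require libssl.so.10, but that is only my guess):

# reinstall transformers and all of its dependencies from PyPI,
# replacing the conda-installed build of tokenizers
pip install transformers --force-reinstall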

sgugger commented 1 year ago

This is not a library used by Transformers per se, but by Python. There is something wrong with your Python install via Conda: a Python installed this way does not find the libssl.so.10 library.
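One way to confirm where the unresolved libssl.so.10 reference comes from is to run ldd against the compiled tokenizers extension inside the environment (a sketch; the paths assume the default Anaconda layout and the Python 3.9 shown in the traceback above):

# the compiled extension whose import fails
ls ~/anaconda3/lib/python3.9/site-packages/tokenizers/tokenizers*.so

# list its shared-library dependencies; libssl.so.10 should show up as "not found"
ldd ~/anaconda3/lib/python3.9/site-packages/tokenizers/tokenizers*.so | grep "not found"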

KatarinaYuan commented 1 year ago

I'm hitting the exact same issue here, and pip install does not solve the problem.

silasalves commented 1 year ago

tl;dr: conda update tokenizers solved the problem for me.


I think I had the same problem and this is how I solved it.

I noticed that the error was related to the Tokenizers package:

from .tokenizers import (
ImportError: /lib/x86_64-linux-gnu/libssl.so.10: version `libssl.so.10' not found (required by /home/silas/miniconda3/envs/llama/lib/python3.8/site-packages/tokenizers/tokenizers.cpython-38-x86_64-linux-gnu.so)

So I decided to check which package provides this library and whether I was using the latest version. PyPI shows that the latest version is 0.13.2 and that the library is maintained by Hugging Face (so we are in the right place, LOL).

After running conda list, I saw that I was using version 0.13.0.dev0. So I checked Conda-Forge and found that they had the new version. Then I ran conda update tokenizers and that solved the problem for me.
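In shell form, the check-and-update sequence was roughly (a sketch of the steps above):

# see which build of tokenizers the environment currently has (0.13.0.dev0 in my case)
conda list tokenizers

# update to the newer release, which fixed the libssl.so.10 error for me
conda update tokenizers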

I hope that solves the problem for you. =)

Deewens commented 1 year ago

I have the exact same issue after using conda to install transformers. Pip works fine, however.

lilyq commented 1 year ago

My tokenizers version is 0.13.0.dev0, but conda update tokenizers doesn't work for me. I also tried conda install -c conda-forge tokenizers, but that doesn't work either. How can I update the tokenizers version?

Vatshank commented 1 year ago

@lilyq I had the same issue. I uninstalled transformers/tokenizers first and then reinstalled from source using pip install git+https://github.com/huggingface/transformers (all within my conda env). This installed the right version of tokenizers as a dependency, and now it works.
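In other words, something like this inside the affected conda environment (a sketch of the steps above; use conda remove instead of pip uninstall if the packages were installed with conda):

# remove the builds that fail to load libssl.so.10
pip uninstall -y transformers tokenizers

# reinstall transformers from source; pip pulls in a compatible tokenizers release as a dependency
pip install git+https://github.com/huggingface/transformers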

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

darendi commented 1 year ago

conda update tokenizers worked great for me, thank you

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.