Closed james-vincent closed 2 years ago
Hi @james-vincent ,
It seems the version of tensorflow you're using is no longer supported (https://github.com/huggingface/transformers/blob/master/setup.py#L155). You need at least TF 2.3 to use transformers.
Are you able to upgrade your dependency?
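The minimum-version requirement can be illustrated with a plain tuple comparison. This is a simplified, hypothetical sketch, not the actual check in setup.py (which uses full pip version specifiers):

```python
def meets_minimum(version: str, minimum: str) -> bool:
    """Numerically compare dotted version strings (illustrative only;
    real dependency resolution also handles pre-releases, epochs, etc.)."""
    parse = lambda v: tuple(int(part) for part in v.split(".")[:3])
    return parse(version) >= parse(minimum)

# TF 2.1.0 (originally installed here) fails the >= 2.3 requirement;
# TF 2.4.1 (the upgrade) passes:
print(meets_minimum("2.1.0", "2.3"))  # False
print(meets_minimum("2.4.1", "2.3"))  # True
```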
Thanks for the quick reply.
I can change versions for everything. This is a standalone conda installation.
I have used tensorflow-gpu version 2.4.1 but now get a different error when running the test:
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I hate you'))"
[…]
Traceback (most recent call last):
File "
Installed versions of pertinent packages (via mamba):
cudatoolkit 9.0 h13b8566_0 anaconda
cudnn 7.6.5 cuda9.0_0 anaconda
huggingface_hub 0.1.2 pyhd8ed1ab_0 conda-forge
keras-preprocessing 1.1.2 pypi_0 pypi
mamba 0.17.0 py36h05d92e0_0 conda-forge
nccl 1.3.5 cuda9.0_0 anaconda
numpy 1.19.5 py36hfc0c790_2 conda-forge
pip 21.3.1 pyhd8ed1ab_0 conda-forge
python 3.6.7 h357f687_1008_cpython conda-forge
pytorch 0.4.0 py36hdf912b8_0 anaconda
tensorflow-gpu 2.4.1 pypi_0 pypi
James Vincent, PhD Bioinformatics Software Curator Dept. of BCMP, Harvard Medical School — BioGrids.org --
Hi @james-vincent, seems like the error now lies in the torch import. Do you mind sharing the version you're running? Maybe simply updating those dependencies should work.
Cheers,
Ah - thanks. I did not pay attention and installed pytorch 0.4.0.
I updated to pytorch 1.10.0 and now the test passes just fine.
Thanks for the help.
I found that the conda recipe, both from conda-forge and huggingface channels, did not install tensorflow. This is why I did it manually. It would be great if the conda recipe had tensorflow but maybe there are other conflicts or considerations.
Thanks again, Jim
Hi @james-vincent, I am not an expert in conda whatsoever, but yes: transformers can run on EITHER torch, tensorflow, or jax independently, so there are no hard requirements and we don't depend on ANY single one (even though without any of those dependencies, the library's use is going to be very limited). You can also use all of them at the same time if you so desire.
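That no-hard-dependency design boils down to probing which backends are importable and enabling features per backend found. A minimal standard-library sketch of the idea (transformers itself exposes richer helpers such as `is_torch_available()`; the function below is made up for illustration):

```python
import importlib.util

def backend_available(name: str) -> bool:
    """Return True if the named package can be imported, without importing it."""
    return importlib.util.find_spec(name) is not None

# None of these is required; features are enabled for whichever is installed:
for backend in ("torch", "tensorflow", "jax"):
    print(f"{backend}: {'available' if backend_available(backend) else 'not installed'}")
```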
Closing this, feel free to reopen if something was missed.
I ran into the same problem and upgraded from TF 2.1 to TF 2.3, which solved it. Thank you, @Narsil.
Environment info
Output of `transformers-cli env` is an error ending with:
RuntimeError: Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback): No module named 'tensorflow.python.keras.engine.keras_tensor'
transformers version: 4.12.3
Who can help
Library:
To reproduce
Steps to reproduce the behavior:
Installation with Mamba using conda recipe for transformers:
```
micromamba create -y -p <path> mamba python=3.6 cudatoolkit=10.0 cudnn=7.6.0 pytorch
micromamba install -y -p <path> pandas seaborn plotly bokeh scikit-learn statsmodels scipy matplotlib simpleitk -c simpleitk
micromamba install -y -p <path> transformers=4.12.3
source <path>/bin/activate base
python -m pip install --upgrade pip
python -m pip install tensorflow-gpu==2.1.0
```
Output of sample given in installation docs:
./python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I hate you'))"
```
Traceback (most recent call last):
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2150, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/modeling_tf_utils.py", line 30, in <module>
    from tensorflow.python.keras.engine.keras_tensor import KerasTensor
ModuleNotFoundError: No module named 'tensorflow.python.keras.engine.keras_tensor'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2150, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/pipelines/__init__.py", line 25, in <module>
    from ..models.auto.configuration_auto import AutoConfig
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/models/__init__.py", line 19, in <module>
    from . import (
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/models/layoutlm/__init__.py", line 22, in <module>
    from .configuration_layoutlm import LAYOUTLM_PRETRAINED_CONFIG_ARCHIVE_MAP, LayoutLMConfig
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/models/layoutlm/configuration_layoutlm.py", line 22, in <module>
    from ...onnx import OnnxConfig, PatchingSpec
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/onnx/__init__.py", line 17, in <module>
    from .convert import export, validate_model_outputs
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/onnx/convert.py", line 23, in <module>
    from .. import PreTrainedModel, PreTrainedTokenizer, TensorType, TFPreTrainedModel, is_torch_available
  File "<frozen importlib._bootstrap>", line 1020, in _handle_fromlist
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2140, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2154, in _get_module
    ) from e
RuntimeError: Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
No module named 'tensorflow.python.keras.engine.keras_tensor'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "<frozen importlib._bootstrap>", line 1020, in _handle_fromlist
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2140, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/programs/x86_64-linux/transformers/4.12.3_cu10.0/lib/python3.6/site-packages/transformers/file_utils.py", line 2154, in _get_module
    ) from e
RuntimeError: Failed to import transformers.pipelines because of the following error (look up to see its traceback):
Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
No module named 'tensorflow.python.keras.engine.keras_tensor'
```
Expected behavior
Expected model output ending with: [{'label': 'NEGATIVE', 'score': 0.9991129040718079}]