Closed: amyeroberts closed this issue 7 months ago.
@dangkhoasdc @Keneyr @stevhliu Thanks for reporting on the PR. I've opened this issue to keep track of this.
The behaviour is very weird, as the offending line is guarded by the pytorch version, so shouldn't be run at all.
@dangkhoasdc @Keneyr Could you share the transformers version you're running and what device you're running on e.g. GPU?
I'm not able to reproduce running just on my mac:
- `transformers` version: 4.38.2
- Platform: macOS-14.2.1-arm64-arm-64bit
- Python version: 3.10.9
- Huggingface_hub version: 0.21.3
- Safetensors version: 0.4.2
- Accelerate version: 0.28.0
- Accelerate config: not found
- PyTorch version (GPU?): 1.11.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): 0.7.0 (cpu)
- Jax version: 0.4.12
- JaxLib version: 0.4.12
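If `import transformers` itself is the step that fails for you, something like the following (just a rough sketch using `importlib.metadata`, so the failing import isn't needed) is enough to grab the versions we're after:

```python
# Rough sketch: report the versions relevant to this issue without importing
# transformers, since the import itself is what fails for some reporters.
from importlib.metadata import version

import torch

print("transformers:", version("transformers"))
print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
```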
I am not able to reproduce with several combinations of envs (but I am only on Python 3.8 and my TF version only goes up to 2.13). @amyeroberts does the env in the description reproduce the issue?
@ydshieh The env in the issue's description is from @stevhliu, cf. this comment, which replicated the issue.
The env in my comment here is what I ran, which didn't replicate the issue.
Hi, unfortunately I'm unable to reproduce your issue with torch==1.11.0. Here's the install environment I used:
IIRC, @stevhliu is unable to replicate the issue with the env posted
Let's wait for the PR author's response.
@ydshieh D'oh, that's my reading comprehension failing, sorry - I misread it as "I'm able to reproduce". I'd say let's just close the issue then - it's not something we're seeing reported elsewhere and not something we can replicate.
Hi, I have the same issue on Oracle Linux 9.3 with python3 compiled from source. We have Python 3.8-3.12 installed as env modules if you need anything tested.
MWE:
python3 -c 'import transformers'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/app/tmbed/1.0.0-3.10.13/transformers/__init__.py", line 26, in <module>
    from . import dependency_versions_check
  File "/app/tmbed/1.0.0-3.10.13/transformers/dependency_versions_check.py", line 16, in <module>
    from .utils.versions import require_version, require_version_core
  File "/app/tmbed/1.0.0-3.10.13/transformers/utils/__init__.py", line 33, in <module>
    from .generic import (
  File "/app/tmbed/1.0.0-3.10.13/transformers/utils/generic.py", line 455, in <module>
    _torch_pytree.register_pytree_node(
AttributeError: module 'torch.utils._pytree' has no attribute 'register_pytree_node'. Did you mean: '_register_pytree_node'?
EDIT: I tried importing transformers with the same python and pytorch versions and different transformers versions and got these results:
!conda update -n base conda -y
!conda update python -y
!pip3 install --upgrade pip setuptools wheel packaging
!pip3 install --upgrade tensorflow torch pandas matplotlib nltk faiss-gpu ipywidgets einops
!pip3 install --upgrade accelerate scipy langchain langchain-community datasets PyMuPDF
!pip3 install --upgrade attention-sinks tiktoken sentence_transformers optimum auto-gptq
!pip3 install transformers==4.40.0
AttributeError                            Traceback (most recent call last)
Cell In[5], line 13
     11 import scipy
     12 import tiktoken
---> 13 from sentence_transformers import SentenceTransformer
     14 from scipy.stats import fisher_exact
     15 from torch import nn

File /opt/conda/lib/python3.10/site-packages/sentence_transformers/__init__.py:7
      4 import importlib
      5 import os
----> 7 from sentence_transformers.cross_encoder.CrossEncoder import CrossEncoder
      8 from sentence_transformers.datasets import ParallelSentencesDataset, SentencesDataset
      9 from sentence_transformers.LoggingHandler import LoggingHandler

File /opt/conda/lib/python3.10/site-packages/sentence_transformers/cross_encoder/__init__.py:1
----> 1 from .CrossEncoder import CrossEncoder
      3 __all__ = ["CrossEncoder"]

File /opt/conda/lib/python3.10/site-packages/sentence_transformers/cross_encoder/CrossEncoder.py:12
     10 from torch.utils.data import DataLoader
     11 from tqdm.autonotebook import tqdm, trange
---> 12 from transformers import AutoConfig, AutoModelForSequenceClassification, AutoTokenizer, is_torch_npu_available
     13 from transformers.tokenization_utils_base import BatchEncoding
     14 from transformers.utils import PushToHubMixin

File /opt/conda/lib/python3.10/site-packages/transformers/__init__.py:26
     23 from typing import TYPE_CHECKING
     25 # Check the dependencies satisfy the minimal versions required.
---> 26 from . import dependency_versions_check
     27 from .utils import (
     28     OptionalDependencyNotAvailable,
     29     _LazyModule,
    (...)
     48     logging,
     49 )
     52 logger = logging.get_logger(__name__)  # pylint: disable=invalid-name

File /opt/conda/lib/python3.10/site-packages/transformers/dependency_versions_check.py:16
      1 # Copyright 2020 The HuggingFace Team. All rights reserved.
      2 #
      3 # Licensed under the Apache License, Version 2.0 (the "License");
    (...)
     12 # See the License for the specific language governing permissions and
     13 # limitations under the License.
     15 from .dependency_versions_table import deps
---> 16 from .utils.versions import require_version, require_version_core
     19 # define which module versions we always want to check at run time
     20 # (usually the ones defined in install_requires in setup.py)
     21 #
     22 # order specific notes:
     23 # - tqdm must be checked before tokenizers
     25 pkgs_to_check_at_runtime = [
     26     "python",
     27     "tqdm",
    (...)
     37     "pyyaml",
     38 ]

File /opt/conda/lib/python3.10/site-packages/transformers/utils/__init__.py:33
     24 from .constants import IMAGENET_DEFAULT_MEAN, IMAGENET_DEFAULT_STD, IMAGENET_STANDARD_MEAN, IMAGENET_STANDARD_STD
     25 from .doc import (
     26     add_code_sample_docstrings,
     27     add_end_docstrings,
    (...)
     31     replace_return_docstrings,
     32 )
---> 33 from .generic import (
     34     ContextManagers,
     35     ExplicitEnum,
     36     ModelOutput,
     37     PaddingStrategy,
     38     TensorType,
     39     add_model_info_to_auto_map,
     40     cached_property,
     41     can_return_loss,
     42     expand_dims,
     43     find_labels,
     44     flatten_dict,
     45     infer_framework,
     46     is_jax_tensor,
     47     is_numpy_array,
     48     is_tensor,
     49     is_tf_symbolic_tensor,
     50     is_tf_tensor,
     51     is_torch_device,
     52     is_torch_dtype,
     53     is_torch_tensor,
     54     reshape,
     55     squeeze,
     56     strtobool,
     57     tensor_size,
     58     to_numpy,
     59     to_py_obj,
     60     transpose,
     61     working_or_temp_dir,
     62 )
     63 from .hub import (
     64     CLOUDFRONT_DISTRIB_PREFIX,
     65     HF_MODULES_CACHE,
    (...)
     91     try_to_load_from_cache,
     92 )
     93 from .import_utils import (
     94     ACCELERATE_MIN_VERSION,
     95     ENV_VARS_TRUE_AND_AUTO_VALUES,
    (...)
    207     torch_only_method,
    208 )

File /opt/conda/lib/python3.10/site-packages/transformers/utils/generic.py:478
    475     return output_type(**dict(zip(context, values)))
    477 if version.parse(get_torch_version()) >= version.parse("2.2"):
--> 478     _torch_pytree.register_pytree_node(
    479         ModelOutput,
    480         _model_output_flatten,
    481         partial(_model_output_unflatten, output_type=ModelOutput),
    482         serialized_type_name=f"{ModelOutput.__module__}.{ModelOutput.__name__}",
    483     )
    484 else:
    485     _torch_pytree._register_pytree_node(
    486         ModelOutput,
    487         _model_output_flatten,
    488         partial(_model_output_unflatten, output_type=ModelOutput),
    489     )
AttributeError: module 'torch.utils._pytree' has no attribute 'register_pytree_node'
@Kuchiriel Could you please provide a reproducible code snippet and your running environment? Run `transformers-cli env` in the terminal and copy-paste the output.
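If `transformers-cli env` also trips over the import error, a manual check along these lines (just a sketch) would help pin down which pytree API your installed torch actually exposes:

```python
# Sketch of a manual check: which pytree registration API does the installed
# torch expose, and which version string does it report?
import torch
import torch.utils._pytree as _torch_pytree

print("torch.__version__:", torch.__version__)
print("has register_pytree_node: ", hasattr(_torch_pytree, "register_pytree_node"))
print("has _register_pytree_node:", hasattr(_torch_pytree, "_register_pytree_node"))
```

The guard in generic.py assumes the public `register_pytree_node` exists from torch 2.2 onwards, so seeing `False` for it together with a reported version of 2.2+ would point at a mismatched install.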
Hmm, that line is guarded with `if version.parse(get_torch_version()) >= version.parse("2.2")`. Not sure if the specified version is incorrect. Let's wait for @Kuchiriel's reply.
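One way the >= 2.2 branch could be reached on an older torch (an assumption on my side, not something confirmed by the logs here): the version string the guard compares may come from the installed torch distribution metadata, while the module actually imported from sys.path could be an older copy, e.g. in a partially upgraded or mixed conda/pip environment. A quick check for that kind of mismatch:

```python
# Sketch: detect a mismatch between the torch distribution metadata (what a
# version-string guard could see) and the torch module actually imported.
from importlib.metadata import version as dist_version

import torch

print("distribution metadata:", dist_version("torch"))
print("imported module:      ", torch.__version__)
print("module location:      ", torch.__file__)
```

If the two version strings disagree, the environment rather than the guard is the likely culprit.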
System Info
One breaking env:
Who can help?
@ydshieh
Information

Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
Run `import transformers` on lower versions of pytorch. It will return the error shown in the tracebacks above (AttributeError: module 'torch.utils._pytree' has no attribute 'register_pytree_node').
Also reported on torch 1.13.
c.f. this PR: https://github.com/huggingface/transformers/pull/29364
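For context, the comparison in that guard behaves as expected for plain version strings (a quick check with `packaging.version`, which the guard appears to use in the tracebacks above), so the branch taken should normally follow the installed torch:

```python
# Quick sanity check (not from the original report): the packaging-based
# comparison routes torch 1.x builds to the old private API and 2.2+ builds
# to the public register_pytree_node.
from packaging import version

print(version.parse("1.13.1") >= version.parse("2.2"))       # False -> _register_pytree_node branch
print(version.parse("2.2.0+cu121") >= version.parse("2.2"))  # True  -> register_pytree_node branch
```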
Expected behavior
Officially supported torch versions work