Closed · hzg0601 closed this issue 2 years ago
Hi @hzg0601,
CogDL indeed depends only on PyTorch. This error looks like a problem caused by the transformers dependency. Could you post the full output of pip list?
The output of pip list is as follows:

Package Version
absl-py 0.14.0
aiohttp 3.7.4.post0
alembic 1.7.5
astroid 2.6.6
astunparse 1.6.3
async-timeout 3.0.1
attrs 21.2.0
autopage 0.4.0
backcall 0.2.0
backports.entry-points-selectable 1.1.1
brotlipy 0.7.0
cachetools 4.2.2
certifi 2021.5.30
cffi 1.14.6
cfgv 3.3.1
chardet 4.0.0
charset-normalizer 2.0.0
click 8.0.3
cliff 3.10.0
cmaes 0.8.2
cmd2 2.3.3
cogdl 0.5.1.post1
colorama 0.4.4
colorlog 6.6.0
cryptography 3.4.7
cycler 0.10.0
Cython 0.29.24
debugpy 1.4.1
decorator 4.4.2
dgl 0.7.0
dill 0.3.4
distlib 0.3.4
fastdtw 0.3.4
filelock 3.4.0
flake8 4.0.1
fsspec 2021.9.0
future 0.18.2
gast 0.3.3
gensim 3.8.3
google-auth 1.35.0
google-auth-oauthlib 0.4.6
google-pasta 0.2.0
googledrivedownloader 0.4
grave 0.0.3
greenlet 1.1.2
grpcio 1.41.0
h5py 2.10.0
huggingface-hub 0.2.1
identify 2.4.0
idna 3.1
importlib-metadata 4.8.2
importlib-resources 5.4.0
ipykernel 6.2.0
ipython 7.26.0
ipython-genutils 0.2.0
isort 5.9.3
jedi 0.18.0
Jinja2 3.0.1
joblib 1.0.1
jupyter-client 6.1.12
jupyter-core 4.7.1
Keras-Preprocessing 1.1.2
kiwisolver 1.3.1
lazy-object-proxy 1.6.0
littleutils 0.2.2
llvmlite 0.37.0
Mako 1.1.6
Markdown 3.3.4
MarkupSafe 2.0.1
matplotlib 3.4.2
matplotlib-inline 0.1.2
mccabe 0.6.1
mkl-fft 1.3.0
mkl-random 1.2.2
mkl-service 2.4.0
multidict 5.1.0
networkx 2.6.2
ninja 1.10.2.3
nodeenv 1.6.0
numba 0.54.1
numpy 1.20.3
oauthlib 3.1.1
ogb 1.3.1
olefile 0.46
opt-einsum 3.3.0
optuna 2.4.0
outdated 0.2.1
packaging 21.0
pandas 1.3.1
parso 0.8.2
pbr 5.8.0
pexpect 4.8.0
pickleshare 0.7.5
Pillow 8.3.1
pip 21.2.2
platformdirs 2.4.0
pre-commit 2.16.0
prettytable 2.4.0
prompt-toolkit 3.0.17
protobuf 3.18.0
ptyprocess 0.7.0
pyasn1 0.4.8
pyasn1-modules 0.2.8
pycodestyle 2.8.0
pycparser 2.20
pyDeprecate 0.3.1
pyflakes 2.4.0
Pygments 2.10.0
pylint 2.9.6
pyOpenSSL 20.0.1
pyparsing 2.4.7
pyperclip 1.8.2
PySocks 1.7.1
python-dateutil 2.8.2
python-louvain 0.15
pytorch-lightning 1.4.8
pytz 2021.1
PyYAML 5.4.1
pyzmq 22.2.1
regex 2021.8.3
requests 2.26.0
requests-oauthlib 1.3.0
rsa 4.7.2
sacremoses 0.0.46
scikit-learn 0.24.2
scipy 1.4.1
sentencepiece 0.1.96
setuptools 52.0.0.post20210125
six 1.16.0
smart-open 5.2.1
SQLAlchemy 1.4.28
stevedore 3.5.0
tabulate 0.8.9
tensorboard 2.2.2
tensorboard-data-server 0.6.1
tensorboard-plugin-wit 1.8.0
tensorflow-estimator 2.2.0
tensorflow-gpu 2.2.0
termcolor 1.1.0
threadpoolctl 2.2.0
tokenizers 0.10.3
toml 0.10.2
torch 1.8.1
torch-cluster 1.5.9
torch-geometric 1.7.2
torch-scatter 2.0.8
torch-sparse 0.6.11
torch-spline-conv 1.2.1
torchaudio 0.8.0a0+e4e171a
torchmetrics 0.5.1
torchvision 0.9.1
tornado 6.1
tqdm 4.62.0
traitlets 5.0.5
transformers 4.13.0
typing-extensions 3.10.0.0
urllib3 1.26.6
virtualenv 20.10.0
wcwidth 0.2.5
Werkzeug 2.0.1
wheel 0.36.2
wrapt 1.12.1
yarl 1.6.3
zipp 3.6.0

I installed CogDL with pip, and its dependencies were installed with pip as well.
Hi @hzg0601, my guess is that TensorFlow is present in your environment, which makes transformers try to import its TensorFlow-related modules. Could you set up a fresh PyTorch-only environment, install cogdl there, and give it a try?
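For example, a quick sanity check after installing cogdl in a fresh PyTorch-only environment could look like the sketch below; the check itself is only illustrative, not part of CogDL's documented usage.

# Sketch: confirm the new environment has no TensorFlow, so transformers
# stays on its pure-PyTorch code path when cogdl imports it.
import importlib.util

assert importlib.util.find_spec("tensorflow") is None, "TensorFlow is still installed"

import cogdl  # should now import without touching transformers' TF utilities
print("cogdl imported from:", cogdl.__file__)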
I only need the embedding API and don't use oagbert, so after removing the oagbert import from the package's top-level __init__.py and the oagbert-related imports and classes from pipeline.py, everything runs fine.
That said, I suspect many users will inevitably have both PyTorch and TensorFlow installed in the same environment. Do you plan to fix this later?
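For reference, a less invasive variant of the edit described above would be to guard the optional import in cogdl/__init__.py rather than delete it. The sketch below is my own suggestion based on the paths in the traceback, not CogDL's released code.

# cogdl/__init__.py (sketch): keep the oagbert import, but make it optional so
# a broken transformers/TensorFlow combination cannot break `import cogdl`.
import warnings

try:
    from .oag import oagbert  # pulls in transformers, which may probe TensorFlow
except (ImportError, RuntimeError) as err:  # transformers raises RuntimeError here
    oagbert = None
    warnings.warn(f"oagbert is unavailable and will be skipped: {err}")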
We are preparing a fix for this.
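One common shape for such a fix is to defer the transformers import until oagbert is actually requested, roughly as sketched below for cogdl/oag/oagbert.py. This is only a guess at the approach, and the function signature is illustrative, not the actual patch.

# cogdl/oag/oagbert.py (sketch): import transformers lazily so that merely
# importing cogdl never triggers transformers' TensorFlow probing.
def oagbert(model_name_or_path="oagbert-v1"):  # signature is illustrative
    from transformers import BertTokenizer  # deferred: runs only when called
    tokenizer = BertTokenizer.from_pretrained(model_name_or_path)
    ...  # load the matching OAG-BERT weights here (omitted in this sketch)
    return tokenizer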
Looking forward to it.
Loading CogDL fails with the error below. Isn't CogDL based on PyTorch? Why does a TensorFlow error show up, and how can I bypass the TensorFlow import entirely?
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/file_utils.py", line 2281, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/models/__init__.py", line 19, in <module>
    from . import (
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/models/layoutlm/__init__.py", line 22, in <module>
    from .configuration_layoutlm import LAYOUTLM_PRETRAINED_CONFIG_ARCHIVE_MAP, LayoutLMConfig
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/models/layoutlm/configuration_layoutlm.py", line 22, in <module>
    from ...onnx import OnnxConfig, PatchingSpec
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/onnx/__init__.py", line 17, in <module>
    from .convert import export, validate_model_outputs
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/onnx/convert.py", line 23, in <module>
    from transformers import PreTrainedModel, PreTrainedTokenizer, TensorType, TFPreTrainedModel, is_torch_available
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/file_utils.py", line 2271, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/file_utils.py", line 2283, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
No module named 'tensorflow.python.keras.engine.keras_tensor'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "", line 1, in <module>
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/cogdl/__init__.py", line 4, in <module>
    from .oag import oagbert
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/cogdl/oag/__init__.py", line 1, in <module>
    from .oagbert import oagbert
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/cogdl/oag/oagbert.py", line 6, in <module>
    from transformers import BertTokenizer
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/file_utils.py", line 2271, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/huangzhiguo/.conda/envs/huangzg_sggd/lib/python3.8/site-packages/transformers/file_utils.py", line 2283, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.bert because of the following error (look up to see its traceback):
Failed to import transformers.modeling_tf_utils because of the following error (look up to see its traceback):
No module named 'tensorflow.python.keras.engine.keras_tensor'
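For what it's worth, the bottom of the traceback points at the TensorFlow build itself: tensorflow-gpu 2.2.0 is installed but does not ship tensorflow.python.keras.engine.keras_tensor, which transformers 4.13.0 tries to import. A quick diagnostic like the sketch below (assuming the same environment as the pip list above) reproduces the failure without involving cogdl at all.

# Diagnostic sketch: reproduce the failing import that transformers hits,
# independent of cogdl.
import importlib

import tensorflow as tf
import transformers

print("tensorflow:", tf.__version__)              # 2.2.0 in the environment above
print("transformers:", transformers.__version__)  # 4.13.0 in the environment above

try:
    importlib.import_module("tensorflow.python.keras.engine.keras_tensor")
except ModuleNotFoundError as err:
    # This TF build does not provide keras_tensor, so the same error surfaces
    # whenever transformers (and therefore cogdl) tries to load its TF utilities.
    print("reproduced:", err)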