datawhalechina / self-llm

《开源大模型食用指南》 ("A Guide to Consuming Open-Source LLMs"): quick deployment of open-source large models in a Linux environment; a deployment tutorial tailored for users in China.
Apache License 2.0

Failed to import transformers.models.qwen2 #44

Closed: lhtpluto closed this issue 8 months ago

lhtpluto commented 8 months ago

```
RuntimeError: Failed to import transformers.models.qwen2.modeling_qwen2 because of the following error (look up to see its traceback):
/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops9_pad_enum4callERKNS_6TensorEN3c108ArrayRefINS5_6SymIntEEElNS5_8optionalIdEE
```
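An `undefined symbol` error from `flash_attn_2_cuda...so` usually means the installed flash-attn wheel was compiled against a different PyTorch ABI than the torch version in the environment. A minimal diagnostic sketch (the `installed_version` helper is hypothetical, not part of any of these libraries) that prints the versions involved so a mismatch is easy to spot:

```python
# Print the installed versions of the packages involved in the error, so a
# torch / flash-attn ABI mismatch is easy to spot. Uses only the standard
# library, so it works even when importing flash_attn itself crashes.
from importlib import metadata
from typing import Optional


def installed_version(package: str) -> Optional[str]:
    """Return the installed version of `package`, or None if it is absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None


if __name__ == "__main__":
    for pkg in ("torch", "flash-attn", "transformers"):
        print(f"{pkg}: {installed_version(pkg) or 'not installed'}")
```

If the reported flash-attn build predates the installed torch release (as here, flash-attn 2.5.0 alongside torch 2.2.0), reinstalling a flash-attn wheel built for that torch version is usually the fix.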

KMnO4-zx commented 8 months ago

Please describe your problem in detail, including (but not limited to) what device you are running on and which file you are running.

lhtpluto commented 8 months ago

> Please describe your problem in detail, including (but not limited to) what device you are running on and which file you are running.

WSL2

```
(qwen1.5) root@DESKTOP-AO2ANO7:/home/wsl/Qwen2# streamlit run Chatbot.py --server.address 127.0.0.1 --server.port 6006

  You can now view your Streamlit app in your browser.

  URL: http://127.0.0.1:6006

gio: http://127.0.0.1:6006: Operation not supported
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
2024-02-07 12:31:57.744 Uncaught app exception
```

```
(qwen1.5) root@DESKTOP-AO2ANO7:/home/wsl/Qwen2# pip list
Package  Version
absl-py 2.1.0 accelerate 0.24.1 aiofiles 23.2.1 aiohttp 3.9.3 aiosignal 1.3.1 altair 5.2.0 annotated-types 0.6.0 anyio 4.2.0 appdirs 1.4.4 attributedict 0.3.0 attrs 23.2.0 auto-gptq 0.6.0+cu121 autoawq 0.1.8 bitsandbytes 0.41.1 blessings 1.7 blinker 1.7.0 cachetools 5.3.2 certifi 2024.2.2 chardet 5.2.0 charset-normalizer 3.3.2 click 8.1.7 codecov 2.1.13 colorama 0.4.6 coloredlogs 15.0.1 colour-runner 0.1.1 contourpy 1.2.0 coverage 7.4.1 cramjam 2.8.1 ctransformers 0.2.27+cu121 cycler 0.12.1 DataProperty 1.0.1 datasets 2.16.1 deepdiff 6.7.1 dill 0.3.7 diskcache 5.6.3 distlib 0.3.8 distro 1.9.0 docker-pycreds 0.4.0 einops 0.7.0 exllamav2 0.0.12+cu121 fastapi 0.109.2 fastparquet 2023.10.1 ffmpy 0.3.1 filelock 3.13.1 flash-attn 2.5.0 fonttools 4.47.2 frozenlist 1.4.1 fsspec 2023.10.0 gekko 1.0.6 gitdb 4.0.11 GitPython 3.1.41 google-auth 2.27.0 google-auth-oauthlib 1.2.0 gptq-for-llama 0.1.1+cu121 gradio 3.50.2 gradio_client 0.6.1 grpcio 1.60.1 h11 0.14.0 hqq 0.1.2.post1 httpcore 1.0.2 httpx 0.26.0 huggingface-hub 0.20.3 humanfriendly 10.0 idna 3.6 importlib-metadata 6.11.0 importlib-resources 6.1.1 inspecta 0.1.3 Jinja2 3.1.2 joblib 1.3.2 jsonlines 4.0.0 jsonschema 4.21.1 jsonschema-specifications 2023.12.1 kiwisolver 1.4.5 llama_cpp_python 0.2.38+cpuavx2 llama_cpp_python_cuda 0.2.38+cu121 llama_cpp_python_cuda_tensorcores 0.2.38+cu121 lm-eval 0.3.0 Markdown 3.5.2 markdown-it-py 3.0.0 MarkupSafe 2.1.5 matplotlib 3.8.2 mbstrdecoder 1.1.3 mdurl 0.1.2 mpmath 1.3.0 multidict 6.0.5 multiprocess 0.70.15 networkx 3.2.1 ninja 1.11.1.1 nltk 3.8.1 numexpr 2.9.0 numpy 1.24.4 nvidia-cublas-cu12 12.1.3.1 nvidia-cuda-cupti-cu12 12.1.105 nvidia-cuda-nvrtc-cu12 12.1.105 nvidia-cuda-runtime-cu12 12.1.105 nvidia-cudnn-cu12 8.9.2.26 nvidia-cufft-cu12 11.0.2.54 nvidia-curand-cu12 10.3.2.106 nvidia-cusolver-cu12 11.4.5.107 nvidia-cusparse-cu12 12.1.0.106 nvidia-nccl-cu12 2.19.3 nvidia-nvjitlink-cu12 12.3.101 nvidia-nvtx-cu12 12.1.105 oauthlib 3.2.2 openai 1.11.1 optimum 1.16.2 ordered-set 4.1.0 orjson 3.9.13 packaging 23.2 pandas 2.2.0 pathvalidate 3.2.0 peft 0.7.1 Pillow 9.5.0 pip 23.3.1 platformdirs 4.2.0 pluggy 1.4.0 portalocker 2.8.2 protobuf 4.23.4 psutil 5.9.8 py-cpuinfo 9.0.0 pyarrow 15.0.0 pyarrow-hotfix 0.6 pyasn1 0.5.1 pyasn1-modules 0.3.0 pybind11 2.11.1 pycountry 23.12.11 pydantic 2.6.0 pydantic_core 2.16.1 pydeck 0.8.1b0 pydub 0.25.1 Pygments 2.17.2 Pympler 1.0.1 pyparsing 3.1.1 pyproject-api 1.6.1 pytablewriter 1.2.0 python-dateutil 2.8.2 python-multipart 0.0.7 pytz 2024.1 pytz-deprecation-shim 0.1.0.post0 PyYAML 6.0.1 referencing 0.33.0 regex 2023.12.25 requests 2.31.0 requests-oauthlib 1.3.1 rich 13.7.0 rootpath 0.1.1 rouge 1.0.1 rouge-score 0.1.2 rpds-py 0.17.1 rsa 4.9 sacrebleu 1.5.0 safetensors 0.4.2 scikit-learn 1.4.0 scipy 1.12.0 semantic-version 2.10.0 sentencepiece 0.1.99 sentry-sdk 1.40.0 setproctitle 1.3.3 setuptools 68.2.2 six 1.16.0 smmap 5.0.1 sniffio 1.3.0 some-package 0.1 sqlitedict 2.1.0 starlette 0.36.3 streamlit 1.24.0 sympy 1.12 tabledata 1.3.3 tabulate 0.9.0 tcolorpy 0.1.4 tenacity 8.2.3 tensorboard 2.15.1 tensorboard-data-server 0.7.2 termcolor 2.4.0 texttable 1.7.0 threadpoolctl 3.2.0 tiktoken 0.5.2 timm 0.9.12 tokenizers 0.15.1 toml 0.10.2 toolz 0.12.1 torch 2.2.0 torchaudio 2.2.0+cu121 torchvision 0.17.0 tornado 6.4 tox 4.12.1 tqdm 4.66.1 tqdm-multiprocess 0.0.11 transformers 4.37.0 transformers-stream-generator 0.0.4 triton 2.2.0 typepy 1.3.2 typing_extensions 4.9.0 tzdata 2023.4 tzlocal 4.3.1 urllib3 2.2.0 uvicorn 0.27.0.post1 validators 0.22.0 virtualenv 20.25.0 wandb 0.16.2 watchdog 4.0.0 websockets 11.0.3 Werkzeug 3.0.1 wheel 0.41.2 xxhash 3.4.1 yarl 1.9.4 zipp 3.17.0 zstandard 0.22.0
```

```
(qwen1.5) root@DESKTOP-AO2ANO7:/home/wsl/Qwen2# nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Wed_Nov_22_10:17:15_PST_2023
Cuda compilation tools, release 12.3, V12.3.107
Build cuda_12.3.r12.3/compiler.33567101_0
```

```
(qwen1.5) root@DESKTOP-AO2ANO7:/home/wsl/Qwen2# nvidia-smi
Wed Feb  7 12:35:16 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.40.06              Driver Version: 551.23         CUDA Version: 12.4     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA RTX 6000 Ada Gene...    On  |   00000000:34:00.0 Off |                  Off |
| 40%   27C    P8             11W /  300W |      23MiB /  49140MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
|   1  NVIDIA RTX 6000 Ada Gene...    On  |   00000000:CA:00.0 Off |                  Off |
| 41%   32C    P8             16W /  300W |     223MiB /  49140MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
```

KMnO4-zx commented 8 months ago

Hmm, Windows environments are complex and it is hard to pinpoint the problem. I suggest using an AutoDL instance with the same hardware configuration and environment as the tutorial.

lhtpluto commented 8 months ago

Got it. I'll try again in a Linux environment.

lhtpluto commented 8 months ago

The same problem also occurs in a Linux environment.

Linux: Ubuntu 22.04

```
(qwen1.5) root@test:/home/test/qwen1.5# pip list
Package  Version
accelerate 0.24.1 altair 5.2.0 attrs 23.2.0 blinker 1.7.0 cachetools 5.3.2 certifi 2024.2.2 charset-normalizer 3.3.2 click 8.1.7 einops 0.7.0 filelock 3.13.1 flash-attn 2.5.0 fsspec 2024.2.0 gitdb 4.0.11 GitPython 3.1.41 huggingface-hub 0.20.3 idna 3.6 importlib-metadata 6.11.0 Jinja2 3.1.3 jsonschema 4.21.1 jsonschema-specifications 2023.12.1 markdown-it-py 3.0.0 MarkupSafe 2.1.5 mdurl 0.1.2 mpmath 1.3.0 networkx 3.2.1 ninja 1.11.1.1 numpy 1.26.4 nvidia-cublas-cu12 12.1.3.1 nvidia-cuda-cupti-cu12 12.1.105 nvidia-cuda-nvrtc-cu12 12.1.105 nvidia-cuda-runtime-cu12 12.1.105 nvidia-cudnn-cu12 8.9.2.26 nvidia-cufft-cu12 11.0.2.54 nvidia-curand-cu12 10.3.2.106 nvidia-cusolver-cu12 11.4.5.107 nvidia-cusparse-cu12 12.1.0.106 nvidia-nccl-cu12 2.19.3 nvidia-nvjitlink-cu12 12.3.101 nvidia-nvtx-cu12 12.1.105 packaging 23.2 pandas 2.2.0 Pillow 9.5.0 pip 23.3.1 protobuf 4.25.2 psutil 5.9.8 pyarrow 15.0.0 pydeck 0.8.1b0 Pygments 2.17.2 Pympler 1.0.1 python-dateutil 2.8.2 pytz 2024.1 pytz-deprecation-shim 0.1.0.post0 PyYAML 6.0.1 referencing 0.33.0 regex 2023.12.25 requests 2.31.0 rich 13.7.0 rpds-py 0.17.1 safetensors 0.4.2 setuptools 68.2.2 six 1.16.0 smmap 5.0.1 some-package 0.1 streamlit 1.24.0 sympy 1.12 tenacity 8.2.3 tokenizers 0.15.1 toml 0.10.2 toolz 0.12.1 torch 2.2.0 torchaudio 2.2.0 torchvision 0.17.0 tornado 6.4 tqdm 4.66.1 transformers 4.37.2 transformers-stream-generator 0.0.4 triton 2.2.0 typing_extensions 4.9.0 tzdata 2023.4 tzlocal 4.3.1 urllib3 2.2.0 validators 0.22.0 watchdog 4.0.0 wheel 0.41.2 zipp 3.17.0
```

```
(qwen1.5) root@test:/home/test/qwen1.5# nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Tue_Aug_15_22:02:13_PDT_2023
Cuda compilation tools, release 12.2, V12.2.140
Build cuda_12.2.r12.2/compiler.33191640_0
```

```
(qwen1.5) root@test:/home/test/qwen1.5# nvidia-smi
Wed Feb  7 13:09:24 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 545.23.08              Driver Version: 545.23.08    CUDA Version: 12.3     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 4090        On  | 00000000:01:00.0 Off |                  Off |
| 30%   29C    P8              17W / 450W |    16MiB / 24564MiB  |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A      1713      G   /usr/lib/xorg/Xorg                           4MiB  |
+---------------------------------------------------------------------------------------+
```

```
(qwen1.5) root@test:/home/test/qwen1.5# streamlit run Chatbot.py --server.address 0.0.0.0 --server.port 80

  You can now view your Streamlit app in your browser.

  URL: http://0.0.0.0:80

Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
2024-02-07 13:04:29.427 Uncaught app exception
Traceback (most recent call last):
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/caching/cache_utils.py", line 263, in _get_or_create_cached_value
    cached_result = cache.read_result(value_key)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/caching/cache_resource_api.py", line 500, in read_result
    raise CacheKeyNotFoundError()
streamlit.runtime.caching.cache_errors.CacheKeyNotFoundError
```

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/caching/cache_utils.py", line 311, in _handle_cache_miss
    cached_result = cache.read_result(value_key)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/caching/cache_resource_api.py", line 500, in read_result
    raise CacheKeyNotFoundError()
streamlit.runtime.caching.cache_errors.CacheKeyNotFoundError
```

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1364, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/models/qwen2/modeling_qwen2.py", line 49, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops5zeros4callEN3c108ArrayRefINS2_6SymIntEEENS2_8optionalINS2_10ScalarTypeEEENS6_INS2_6LayoutEEENS6_INS2_6DeviceEEENS6_IbEE
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
  File "/home/test/qwen1.5/Chatbot.py", line 30, in <module>
    tokenizer, model = get_model()
                       ^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/caching/cache_utils.py", line 211, in wrapper
    return cached_func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/caching/cache_utils.py", line 240, in __call__
    return self._get_or_create_cached_value(args, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/caching/cache_utils.py", line 266, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/streamlit/runtime/caching/cache_utils.py", line 320, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/test/qwen1.5/Chatbot.py", line 25, in get_model
    model = AutoModelForCausalLM.from_pretrained(mode_name_or_path, torch_dtype=torch.bfloat16, device_map="auto")
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 565, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 387, in _get_model_class
    supported_models = model_mapping[type(config)]
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 740, in __getitem__
    return self._load_attr_from_module(model_type, model_name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 754, in _load_attr_from_module
    return getattribute_from_module(self._modules[module_name], attr)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 698, in getattribute_from_module
    if hasattr(module, attr):
       ^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1354, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1366, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.qwen2.modeling_qwen2 because of the following error (look up to see its traceback):
/root/anaconda3/envs/qwen1.5/lib/python3.11/site-packages/flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops5zeros4callEN3c108ArrayRefINS2_6SymIntEEENS2_8optionalINS2_10ScalarTypeEEENS6_INS2_6LayoutEEENS6_INS2_6DeviceEEENS6_IbEE
```

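Until the flash-attn build itself is fixed, one possible workaround (a sketch, not from the tutorial) is to load the model without FlashAttention-2, so `flash_attn` is never imported. transformers 4.37 selects the attention backend via the `attn_implementation` argument; the checkpoint path below is a placeholder for whatever `mode_name_or_path` points to in `Chatbot.py`:

```python
# Sketch: fall back to the pure-PyTorch ("eager") attention path so the broken
# flash_attn extension is never imported. Requires transformers >= 4.36.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

mode_name_or_path = "Qwen/Qwen1.5-7B-Chat"  # placeholder: use your local model path

tokenizer = AutoTokenizer.from_pretrained(mode_name_or_path)
model = AutoModelForCausalLM.from_pretrained(
    mode_name_or_path,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    attn_implementation="eager",  # skip FlashAttention-2 entirely
)
```

This trades some speed and memory efficiency for not depending on flash-attn at all, which can be useful for confirming that everything else in the environment works.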
KMnO4-zx commented 8 months ago

I'd suggest studying with the same AutoDL environment as the tutorial; local environment issues are too hard to resolve.

KMnO4-zx commented 8 months ago

It works fine on my side; I could not reproduce your error. Try the AutoDL environment; it may be a problem with your local setup.

[screenshot]

lhtpluto commented 8 months ago

Installing flash-attn 2.5.2 (after uninstalling 2.5.0) solved the problem.
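For reference, the reported fix corresponds to the commands below. The exact flash-attn version that matches your torch/CUDA combination may differ, so treat the pin as an example rather than a universal answer:

```shell
# Replace the flash-attn build whose CUDA extension was compiled against an
# older torch ABI with a release compatible with torch 2.2.0.
pip uninstall -y flash-attn
pip install flash-attn==2.5.2 --no-build-isolation

# Sanity check: the import should now succeed without an undefined-symbol error.
python -c "import flash_attn; print(flash_attn.__version__)"
```

`--no-build-isolation` is the installation flag recommended by the flash-attention project so the build (if a wheel is unavailable) sees the already-installed torch.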