Dao-AILab / flash-attention

Fast and memory-efficient exact attention
BSD 3-Clause "New" or "Revised" License

ImportError: libtorch_cuda_cpp.so: cannot open shared object file: No such file or directory #992

Open jxxtin opened 5 months ago

jxxtin commented 5 months ago

My nvcc version is 11.7 and PyTorch is 2.0.1+cu117, as follows:

nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Wed_Jun__8_16:49:14_PDT_2022
Cuda compilation tools, release 11.7, V11.7.99
Build cuda_11.7.r11.7/compiler.31442593_0

>>> import torch
>>> torch.__version__
'2.0.1+cu117'

but I got the following error. Could you give me some advice?

Traceback (most recent call last):
  File "/root/projects/MeshAnything/app.py", line 8, in <module>
    from main import get_args, load_model
  File "/root/projects/MeshAnything/main.py", line 6, in <module>
    from MeshAnything.models.meshanything import MeshAnything
  File "/root/projects/MeshAnything/MeshAnything/models/meshanything.py", line 5, in <module>
    from MeshAnything.models.shape_opt import ShapeOPTConfig
  File "/root/projects/MeshAnything/MeshAnything/models/shape_opt.py", line 2, in <module>
    from transformers.models.opt.modeling_opt import OPTForCausalLM, OPTModel, OPTDecoder, OPTLearnedPositionalEmbedding, OPTDecoderLayer
  File "/root/anaconda3/envs/meshany/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py", line 46, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/root/anaconda3/envs/meshany/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/root/anaconda3/envs/meshany/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: libtorch_cuda_cpp.so: cannot open shared object file: No such file or directory
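This `ImportError` is a dynamic-linker failure, not a Python-level one: the prebuilt `flash_attn_2_cuda` extension was compiled against a PyTorch build that ships `libtorch_cuda_cpp.so`, and the loader cannot find that library in the installed `torch` wheel. A minimal stdlib sketch (the library name is taken from the traceback) reproduces the same loader failure directly:

```python
import ctypes

# Attempting to dlopen the library named in the traceback fails the same way
# the flash_attn_2_cuda extension does when its torch build does not match.
try:
    ctypes.CDLL("libtorch_cuda_cpp.so")
except OSError as e:
    print(e)  # "... cannot open shared object file: No such file or directory"
```

The usual fix is to install a flash-attn wheel built for your exact torch/CUDA combination, or rebuild it locally against the installed torch.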
jimzhang828 commented 5 months ago

Happened to me as well. Have you fixed it?

kang2000h commented 1 day ago

On PyTorch 2.1.0+cu121, the following command saved me: `pip install --no-build-isolation flash-attn==2.5.6`
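After reinstalling, a quick way to confirm which versions actually ended up in the environment is a stdlib-only check (this is just a sketch; it prints "not installed" for any package that is absent):

```python
from importlib.metadata import PackageNotFoundError, version

# Print the installed distribution versions so you can confirm the flash-attn
# wheel you just installed pairs with the torch build in this environment.
for pkg in ("torch", "flash-attn"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```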