ZRayZzz / flash-attention-v100


g++ error during installation #1

Open lucifffer opened 6 months ago

lucifffer commented 6 months ago

I get an error during installation. From the error message, it looks like two object files were never compiled. Is there any way to fix this?

```
running build_ext
/usr/local/miniconda3/lib/python3.8/site-packages/torch/utils/cpp_extension.py:388: UserWarning: The detected CUDA version (11.6) has a minor version mismatch with the version that was used to compile PyTorch (11.7). Most likely this shouldn't be a problem.
  warnings.warn(CUDA_MISMATCH_WARN.format(cuda_str_version, torch.version.cuda))
building 'flash_attn_v100_cuda' extension
creating /mnt/flash-attention-v100/build/temp.linux-x86_64-cpython-38
creating /mnt/flash-attention-v100/build/temp.linux-x86_64-cpython-38/kernel
Emitting ninja build file /mnt/flash-attention-v100/build/temp.linux-x86_64-cpython-38/build.ninja...
Compiling objects...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
1.11.1.git.kitware.jobserver-1
g++ -pthread -B /usr/local/miniconda3/compiler_compat -Wl,--sysroot=/ -pthread -shared -B /usr/local/miniconda3/compiler_compat -L/usr/local/miniconda3/lib -Wl,-rpath=/usr/local/miniconda3/lib -Wl,--no-as-needed -Wl,--sysroot=/ /mnt/flash-attention-v100/build/temp.linux-x86_64-cpython-38/kernel/fused_mha_api.o /mnt/flash-attention-v100/build/temp.linux-x86_64-cpython-38/kernel/fused_mha_kernel.o -L/usr/local/miniconda3/lib/python3.8/site-packages/torch/lib -L/usr/local/cuda/lib64 -lc10 -ltorch -ltorch_cpu -ltorch_python -lcudart -lc10_cuda -ltorch_cuda -o build/lib.linux-x86_64-cpython-38/flash_attn_v100_cuda.cpython-38-x86_64-linux-gnu.so
g++: error: /mnt/flash-attention-v100/build/temp.linux-x86_64-cpython-38/kernel/fused_mha_api.o: No such file or directory
g++: error: /mnt/flash-attention-v100/build/temp.linux-x86_64-cpython-38/kernel/fused_mha_kernel.o: No such file or directory
error: command '/usr/bin/g++' failed with exit code 1
```
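A note on reading this log: the g++ link step fails only because the `.o` files were never produced, which means an earlier nvcc/g++ compile step failed and its error scrolled past in the parallel ninja output. A minimal sketch for surfacing the real compile error (paths taken from the log above, and `python setup.py build_ext` assumed as the build entry point; adjust to your setup):

```shell
# Assumed repo location from the log above; adjust to your machine.
cd /mnt/flash-attention-v100
rm -rf build/   # drop stale state so every compile step reruns
# MAX_JOBS=1 serializes ninja so errors appear next to the file that caused them
MAX_JOBS=1 python setup.py build_ext 2>&1 | tee build.log
# The first "error:" line BEFORE the final g++ link command is the real cause:
grep -n "error:" build.log | head -n 5
```

With a single worker, the first `error:` in `build.log` points at the kernel source that failed to compile rather than at the missing object files.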

ZRayZzz commented 6 months ago

What gcc and g++ versions are you using? The cutlass code in here requires C++17.

lucifffer commented 6 months ago

> What gcc and g++ versions are you using? The cutlass code in here requires C++17.

[image] Is there a problem with the versions? I've already installed the latest ones.

ZRayZzz commented 6 months ago

Is the log above complete? There should be a corresponding nvcc or g++ error somewhere in it.

lucifffer commented 6 months ago

> Is the log above complete? There should be a corresponding nvcc or g++ error somewhere in it.

```
running bdist_egg
running egg_info
writing flash_attn_v100.egg-info/PKG-INFO
writing dependency_links to flash_attn_v100.egg-info/dependency_links.txt
writing requirements to flash_attn_v100.egg-info/requires.txt
writing top-level names to flash_attn_v100.egg-info/top_level.txt
reading manifest file 'flash_attn_v100.egg-info/SOURCES.txt'
writing manifest file 'flash_attn_v100.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
copying flash_attn_v100/init.py -> build/lib.linux-x86_64-cpython-311/flash_attn_v100
copying flash_attn_v100/flash_attn_interface.py -> build/lib.linux-x86_64-cpython-311/flash_attn_v100
running build_ext
/usr/local/lib/python3.11/dist-packages/torch/utils/cpp_extension.py:425: UserWarning: There are no x86_64-linux-gnu-g++ version bounds defined for CUDA version 12.1
  warnings.warn(f'There are no {compiler_name} version bounds defined for CUDA version {cuda_str_version}')
building 'flash_attn_v100_cuda' extension
Emitting ninja build file /root/flash-attention-v100/build/temp.linux-x86_64-cpython-311/build.ninja...
Compiling objects...
Allowing ninja to set a default number of workers... (overridable by setting the environment variable MAX_JOBS=N)
1.10.0
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -g -fwrapv -O2 /root/flash-attention-v100/build/temp.linux-x86_64-cpython-311/kernel/fused_mha_api.o /root/flash-attention-v100/build/temp.linux-x86_64-cpython-311/kernel/fused_mha_kernel.o -L/usr/local/lib/python3.11/dist-packages/torch/lib -L/usr/local/cuda/lib64 -L/usr/lib/x86_64-linux-gnu -lc10 -ltorch -ltorch_cpu -ltorch_python -lcudart -lc10_cuda -ltorch_cuda -o build/lib.linux-x86_64-cpython-311/flash_attn_v100_cuda.cpython-311-x86_64-linux-gnu.so
x86_64-linux-gnu-g++: error: /root/flash-attention-v100/build/temp.linux-x86_64-cpython-311/kernel/fused_mha_kernel.o: No such file or directory
error: command '/usr/bin/x86_64-linux-gnu-g++' failed with exit code 1
```

This is the complete error output. Does the missing file mean it was simply never compiled?