System Info
NVIDIA-SMI 535.161.08 Driver Version: 535.161.08 CUDA Version: 12.2
Python 3.12.4
Who can help?
No response

Information
[ ] The official example scripts
[ ] My own modified scripts
Reproduction
raise RuntimeError(
RuntimeError: FlashAttention is only supported on CUDA 11.6 and above. Note: make sure nvcc has a supported version by running nvcc -V.
torch.__version__ = 2.3.1+cu121
[end of output]
Expected behavior
Yesterday I added FlashAttention-2 and the code stopped running. I installed it from https://github.com/Dao-AILab/flash-attention, but it fails with the error above, even though the environment already reports CUDA 12.
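Note that the failing check reads the version printed by `nvcc -V` (the locally installed CUDA toolkit), not the "CUDA Version" shown by `nvidia-smi`, which only reflects the driver's maximum supported CUDA. A CUDA 12.2 driver can therefore still trip this error if `nvcc` is missing or old. A minimal sketch of that kind of check (helper names are hypothetical, not flash-attention's actual code):

```python
import re
import subprocess

# FlashAttention's stated minimum CUDA toolkit version (from the error message).
MIN_CUDA = (11, 6)

def parse_cuda_version(nvcc_output: str) -> tuple[int, int]:
    """Extract (major, minor) from `nvcc -V` output,
    e.g. 'Cuda compilation tools, release 12.1, V12.1.105' -> (12, 1)."""
    m = re.search(r"release (\d+)\.(\d+)", nvcc_output)
    if not m:
        raise RuntimeError("could not parse `nvcc -V` output; is nvcc installed?")
    return int(m.group(1)), int(m.group(2))

def nvcc_supported() -> bool:
    """Run `nvcc -V` and compare against the minimum required toolkit version."""
    out = subprocess.run(["nvcc", "-V"], capture_output=True, text=True).stdout
    return parse_cuda_version(out) >= MIN_CUDA
```

If `parse_cuda_version` reports something below 11.6 (or `nvcc` is not on PATH at all), installing a CUDA 12.x toolkit that matches the `+cu121` build of torch should resolve the mismatch.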