openai / consistency_models

Official repo for consistency models.
MIT License

RuntimeError: FlashAttention is only supported on CUDA 11 and above #23

Closed: zhouzk5 closed this issue 1 year ago

zhouzk5 commented 1 year ago

[Screenshot from 2023-04-24 15-28-06]

Does anyone have the same problem? Can you help me out? I will be very grateful!

treefreq commented 1 year ago

Don't bother running it; even after reproducing the setup, you won't get images like the ones in the paper.

xiongzhp commented 1 year ago

exactly the same issue

zhouzk5 commented 1 year ago

> Don't bother running it; even after reproducing the setup, you won't get images like the ones in the paper.

Have you managed to reproduce it successfully? Could you give me some pointers?

zhouzk5 commented 1 year ago

> exactly the same issue

Have you found a solution yet?

treefreq commented 1 year ago

The error message already tells you: switch to a newer CUDA version.

CallShaul commented 1 year ago

Same issue here (Kubuntu 20)

haoychen3 commented 1 year ago

I solved this problem by: `export PATH=/usr/local/cuda-11.7/bin:$PATH`

The cause of this error is probably that the CUDA version seen by `nvcc` (check with `nvcc -V`; likely < 11.0) doesn't match the CUDA version your torch build was compiled against (11.7).

Similar issues are solved here: https://stackoverflow.com/questions/40517083/multiple-cuda-versions-on-machine-nvcc-v-confusion
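The diagnosis above can be sketched as a few shell commands; this is a hedged example assuming the CUDA 11.7 toolkit is installed under `/usr/local/cuda-11.7` (the path from the comment above), which may differ on your machine:

```shell
# Check which CUDA toolkit nvcc currently resolves to.
# If it prints a version < 11.0, that is the likely culprit.
command -v nvcc >/dev/null && nvcc -V

# Check the CUDA version PyTorch was built against (11.7 here).
python -c "import torch; print(torch.version.cuda)" 2>/dev/null

# If the two disagree, put the matching toolkit's bin directory
# first on PATH so nvcc from that toolkit is found before others.
export PATH=/usr/local/cuda-11.7/bin:$PATH
```

Note that `export` only affects the current shell session; add the line to `~/.bashrc` (or your shell's equivalent) to make it persistent.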

zhouzk5 commented 1 year ago

> I solved this problem by: `export PATH=/usr/local/cuda-11.7/bin:$PATH`
>
> The cause of this error is probably that the CUDA version of nvcc (obtained by typing "nvcc -V", probably < 11.0) mismatches your CUDA version for torch (11.7).
>
> Similar issues are solved here: https://stackoverflow.com/questions/40517083/multiple-cuda-versions-on-machine-nvcc-v-confusion

Thank you very much! I have solved the problem.