Dao-AILab/flash-attention
Fast and memory-efficient exact attention
BSD 3-Clause "New" or "Revised" License
13.39k stars · 1.22k forks
Does FlashAttention support the RTX 8000?
#944
Open
heart18z opened 4 months ago
heart18z commented 4 months ago
Does FlashAttention support the RTX 8000?
Panda-ka commented 4 months ago
Ampere-architecture GPUs like the 3090 and 4090 are supported; the RTX 8000's architecture probably won't work.
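A quick way to check this on your own machine is to query the GPU's compute capability: FlashAttention-2 requires Ampere (SM 8.0) or newer, while the RTX 8000 is a Turing card (SM 7.5). A minimal sketch, assuming PyTorch is available; `supports_flash_attn2` is an illustrative helper, not part of the flash-attn API:

```python
def supports_flash_attn2(major: int, minor: int) -> bool:
    # FlashAttention-2 needs compute capability >= 8.0 (Ampere or newer);
    # Turing cards like the RTX 8000 report (7, 5), which is too old.
    return (major, minor) >= (8, 0)

try:
    import torch
    if torch.cuda.is_available():
        major, minor = torch.cuda.get_device_capability(0)
        ok = supports_flash_attn2(major, minor)
        print(f"SM {major}.{minor} -> FlashAttention-2 supported: {ok}")
except ImportError:
    pass  # PyTorch not installed; the capability check above still works standalone
```

On an RTX 8000 this prints `SM 7.5 -> FlashAttention-2 supported: False`, matching the answer above.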