Dao-AILab / flash-attention

Fast and memory-efficient exact attention
BSD 3-Clause "New" or "Revised" License

Does it support acceleration on the RTX A5000? #811

Open sssuperrrr opened 8 months ago

Youhe-Jiang commented 8 months ago

Yes, it's supported. I run it on an A5000 all the time.

sssuperrrr commented 8 months ago

Okay, thanks.


dreamwish1998 commented 4 months ago

Does it support running on A6000?

Youhe-Jiang commented 4 months ago

It supports all Ampere-class cards and newer, I think, including the A10, A40, A4000, A5000, A6000, RTX 3090 Ti, RTX 4090, A100, H100, etc.
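What ties this list together is CUDA compute capability: FlashAttention-2 targets Ampere, Ada, and Hopper GPUs, i.e. compute capability 8.0 or higher (the RTX A5000 and A6000 are both Ampere, sm_86). A minimal sketch of such a check, assuming the 8.0 threshold from the repo's stated requirements; on a machine with PyTorch installed you could feed it `torch.cuda.get_device_capability()` instead of a hardcoded tuple:

```python
# FlashAttention-2 requires CUDA compute capability >= 8.0 (Ampere/Ada/Hopper).
# The examples below are illustrative, not an exhaustive support table.
MIN_COMPUTE_CAPABILITY = (8, 0)

def flash_attn2_supported(compute_capability):
    """Return True if a GPU with (major, minor) compute capability
    meets FlashAttention-2's minimum requirement."""
    return compute_capability >= MIN_COMPUTE_CAPABILITY

# RTX A5000 / A6000 are Ampere (sm_86) -> supported.
print(flash_attn2_supported((8, 6)))   # True
# Turing cards (sm_75, e.g. RTX 2080) fall below the FA2 threshold.
print(flash_attn2_supported((7, 5)))   # False
```

With PyTorch available, `flash_attn2_supported(torch.cuda.get_device_capability())` would perform the same check against the GPU actually present.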
