bowang-lab / scGPT

https://scgpt.readthedocs.io/en/latest/
MIT License

Request for FlashAttention v2 Support #249

Open qianachen opened 2 months ago

qianachen commented 2 months ago

Thanks for the great work!

Hi, I was wondering if it would be possible to upgrade the FlashAttention dependency used by the scGPT model to v2. FlashAttention v2 offers notable improvements in memory efficiency and throughput over v1, which could help scale scGPT, especially with large batch sizes or longer gene sequences. Are there any plans to integrate FlashAttention v2 in a future release?
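As a possible interim path (not an official scGPT feature), PyTorch 2.x exposes `torch.nn.functional.scaled_dot_product_attention`, which can dispatch to a FlashAttention kernel when the hardware and dtype allow it, and otherwise falls back to a standard implementation. A minimal sketch with illustrative shapes (batch, heads, sequence length, and head dim are hypothetical, not scGPT's actual config):

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes for illustration: 2 cells, 8 heads,
# a sequence of 1200 gene tokens, head dimension 64.
q = torch.randn(2, 8, 1200, 64)
k = torch.randn(2, 8, 1200, 64)
v = torch.randn(2, 8, 1200, 64)

# On PyTorch >= 2.0 this can use a fused FlashAttention kernel
# (CUDA, fp16/bf16); on CPU it falls back to the math backend,
# so the same code runs everywhere.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # same shape as q: (2, 8, 1200, 64)
```

This would avoid pinning a specific `flash-attn` wheel, though a native v2 integration would still be the cleaner long-term option.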

Thank you for your consideration!