ROCm / MIOpen

AMD's Machine Intelligence Library
https://rocm.docs.amd.com/projects/MIOpen/en/latest/

Request to Add Attention Kernel to MIOpen #2199

Open zjchen77 opened 1 year ago

zjchen77 commented 1 year ago

Attention mechanisms are widely used in deep learning models, particularly in large language models, and a flexible attention kernel would help users build accelerated language models conveniently on the AMD platform. I have already designed an attention kernel that supports variable text lengths. I believe adding an attention kernel to MIOpen would be a valuable enhancement for the community. Thank you for considering this request.
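For context, the operation being requested is scaled dot-product attention, softmax(QK^T / sqrt(d)) V, where "variable text lengths" are typically handled by masking out padded key positions before the softmax. The snippet below is a plain numpy reference sketch of that math, not MIOpen or CK code; the function name, shapes, and masking scheme are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    # Reference (not MIOpen) implementation: softmax(Q K^T / sqrt(d)) V.
    d = q.shape[-1]
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(d)   # (batch, q_len, k_len)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)          # hide padded key positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ v                                  # (batch, q_len, d)

# Padded batch of two sequences with true lengths 4 and 2 (padded to 4).
rng = np.random.default_rng(0)
batch, seq, d = 2, 4, 8
q = rng.standard_normal((batch, seq, d))
k = rng.standard_normal((batch, seq, d))
v = rng.standard_normal((batch, seq, d))
lengths = np.array([4, 2])
# Boolean mask: True where the key position lies inside the real sequence.
mask = np.arange(seq)[None, None, :] < lengths[:, None, None]  # (batch, 1, seq)

out = scaled_dot_product_attention(q, k, v, mask)
print(out.shape)  # (2, 4, 8)
```

A fused kernel would compute this without materializing the full `scores` matrix; the sketch only pins down the semantics the request is asking MIOpen to expose.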

junliume commented 1 year ago

@zjchen77 thanks! Our CK kernels (https://github.com/ROCmSoftwarePlatform/Composable_kernel) already support this, and we are in the process of refining them. Indeed, exposing these kernels via MIOpen could be useful in many cases.

CAHEK7 commented 12 months ago

@JehandadKhan is this issue related to multi-head attention as well?

JehandadKhan commented 11 months ago

@CAHEK7 That is correct.

ppanchad-amd commented 4 months ago

@junliume Has this request been added to MIOpen? Thanks!